SPEC/OSG Result Search Engine




Help for SPEC CPU® 2017 Integer Rate Results

This is a powerful engine for fetching results from SPEC.

There are two interfaces to this engine:

Simple Interface
The simple interface is deliberately minimal and relies heavily on default settings. It can handle many basic inquiries with minimal fuss.
Configurable Interface
The configurable interface offers much more functionality. With this interface, it is possible to choose which fields to display, restrict results with search criteria, remove duplicates, select which quarterly datasets are searched, control sorting, and pick an output format.

Most of this help information is for the configurable interface.

Dataset Fields

These are the fields available in the current configuration (rint2017). Each configuration is likely to have a different set of fields available. [Note: there may be multiple configurations, each with different fields, for any set of results. Typically, the more specific a configuration is to a particular benchmark, the more fields are available.]

Benchmark
The benchmark for which this is a result.

Hardware Vendor
The hardware vendor for the system under test

System
The name of the system tested.

# Cores
The number of cores in the system.

# Chips
The number of chips in the system.

# Enabled Threads Per Core
The number of enabled threads per processor core.

Processor
The name and speed of the central processor(s).

Processor MHz
The advertised speed (MHz) of the central processor(s).

CPU(s) Orderable
The valid number of CPU(s) orderable.

Parallel
Whether multiple threads were employed by a parallelizing compiler.

Base Pointer Size
Whether benchmarks in base used 32- or 64-bit pointers, or a mixture.

Peak Pointer Size
Whether benchmarks in peak used 32- or 64-bit pointers, or a mixture.

1st Level Cache
The size and structure of the first level cache.

2nd Level Cache
The size and structure of the second level cache.

3rd Level Cache
The size and structure of the third level cache.

Other Cache
The size and structure of any other levels of cache.

Memory
The amount of main memory (in MB) in the system under test.

Storage
The type and size of the storage used in the system under test.

Operating System
The name and version of the operating system running on the system under test.

File System
The type of file system used to hold the benchmark tree on the system under test.

Compiler
The name and version of the compiler (and associated software) used.

HW Avail
The date that the hardware for this system is/will be generally available.

SW Avail
The date that the software used for this result is/will be generally available.

Base Copies
The number of copies of the benchmark run simultaneously.

Result
The single-figure-of-merit summary metric.

Baseline
The baseline (less aggressive) summary metric.

Energy Peak Result
Overall energy ratio running N integer programs (tester chooses N), peak tuning.

Energy Base Result
Overall energy ratio running N integer programs (tester chooses N), base tuning.

500 Peak
The ratio for the 500.perlbench_r benchmark.

500 Base
The baseline ratio for the 500.perlbench_r benchmark.

502 Peak
The ratio for the 502.gcc_r benchmark.

502 Base
The baseline ratio for the 502.gcc_r benchmark.

505 Peak
The ratio for the 505.mcf_r benchmark.

505 Base
The baseline ratio for the 505.mcf_r benchmark.

520 Peak
The ratio for the 520.omnetpp_r benchmark.

520 Base
The baseline ratio for the 520.omnetpp_r benchmark.

523 Peak
The ratio for the 523.xalancbmk_r benchmark.

523 Base
The baseline ratio for the 523.xalancbmk_r benchmark.

525 Peak
The ratio for the 525.x264_r benchmark.

525 Base
The baseline ratio for the 525.x264_r benchmark.

531 Peak
The ratio for the 531.deepsjeng_r benchmark.

531 Base
The baseline ratio for the 531.deepsjeng_r benchmark.

541 Peak
The ratio for the 541.leela_r benchmark.

541 Base
The baseline ratio for the 541.leela_r benchmark.

548 Peak
The ratio for the 548.exchange2_r benchmark.

548 Base
The baseline ratio for the 548.exchange2_r benchmark.

557 Peak
The ratio for the 557.xz_r benchmark.

557 Base
The baseline ratio for the 557.xz_r benchmark.

License
The number of the license used to generate this result.

Tested By
The people who have produced this result.

Test Sponsor
The name of the organization or individual that sponsored the test.

Test Date
When this result was measured.

Published
The date this result was first published by SPEC.

Updated
The date this result was last updated by SPEC, though most updates are clerical rather than significant.

Disclosure
Full disclosure report including all the gory details.

Query Features

[Note: there are two kinds of query forms: Simple and Configurable. Most of the features described here are available only in the configurable query.]

Content: Display
Which fields to display. Each field can be set to Display, which shows the entire field, or SKIP, which causes the field not to be displayed. For fields of the string type, it is also possible to limit the width of the field's display by choosing one of the X Chars options.
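The display options can be summarized with a small Python sketch. This is illustrative only, not SPEC's implementation; the record, field names, and settings below are invented for the example.

```python
# Hypothetical record and per-field display settings: "display" shows the
# whole field, "skip" omits it, and an integer acts like an "X Chars" limit.
record = {"System": "ExampleServer X100", "Processor": "Example CPU 3000", "# Cores": 64}
settings = {"System": 10, "Processor": "display", "# Cores": "skip"}

def render(record, settings):
    out = {}
    for field, value in record.items():
        rule = settings.get(field, "display")
        if rule == "skip":
            continue                      # field not displayed at all
        text = str(value)
        if isinstance(rule, int):         # width-limited display for strings
            text = text[:rule]
        out[field] = text
    return out

print(render(record, settings))
# {'System': 'ExampleSer', 'Processor': 'Example CPU 3000'}
```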

Content: Criteria
Limit results to only those that satisfy some criteria. For each field it is possible to specify criteria that will be used to select only certain records out of the entire dataset. String criteria can be regular expressions, numeric fields are compared against the floating-point values you provide, and date fields are compared against the specified month and year. You may specify criteria for any and all fields, whether or not that field will be displayed.
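The three kinds of criteria (regular expressions for strings, floating-point comparisons for numbers, month/year comparisons for dates) can be sketched in Python. The records and thresholds here are invented for illustration.

```python
import re
from datetime import date

# Hypothetical records; field names follow the rint2017 configuration.
records = [
    {"System": "Alpha 100", "# Cores": 32,  "HW Avail": date(2023, 4, 1)},
    {"System": "Beta 200",  "# Cores": 128, "HW Avail": date(2024, 1, 1)},
]

def matches(rec):
    if not re.search(r"^Alpha", rec["System"]):   # string criterion: regex
        return False
    if not rec["# Cores"] >= 16.0:                # numeric criterion: float compare
        return False
    if not rec["HW Avail"] >= date(2023, 1, 1):   # date criterion: month and year
        return False
    return True

selected = [r for r in records if matches(r)]
print([r["System"] for r in selected])  # ['Alpha 100']
```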

Content: Duplicates
Allows the removal of duplicates, such as where there are multiple results for the same configuration. Duplicates are defined to be records that have matching values across a specified set of fields. Duplicates are then ranked according to their values in a specified key field. There are three possible actions for duplicates: return all records (the default), return the one result with the latest (or greatest) value, or return the one result with the earliest (or smallest) value.
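The duplicate-removal semantics can be sketched as follows. This is an illustrative Python model, not SPEC's code; the grouping field, key field, and records are invented.

```python
# Records matching across the grouping fields are duplicates; keep the one
# with the latest (or, for keep="earliest", the earliest) key-field value.
records = [
    {"System": "Alpha 100", "Result": 250, "Published": "2023Q2"},
    {"System": "Alpha 100", "Result": 260, "Published": "2023Q4"},
    {"System": "Beta 200",  "Result": 410, "Published": "2023Q3"},
]

def dedupe(records, group_fields, key_field, keep="latest"):
    best = {}
    for rec in records:
        group = tuple(rec[f] for f in group_fields)
        if group not in best:
            best[group] = rec
        else:
            newer = rec[key_field] > best[group][key_field]
            if (keep == "latest") == newer:   # replace when this record ranks better
                best[group] = rec
    return list(best.values())

for rec in dedupe(records, ["System"], "Published"):
    print(rec["System"], rec["Result"])
```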

Content: Publication
Specify which datasets to search for results. All SPEC results are published on a quarterly basis; this allows you to specify the range of quarters that you are interested in. Note: there are some quarters where no results were published for certain benchmarks; datasets which would have no available results will not be present in the selection list. The default is for all quarters to be searched.

Sorting: Column
This search engine returns its findings in sorted order. The ordering is based on up to three keys: a primary, a secondary, and, if records are still tied, a tertiary key to settle ties.

Sorting: Direction
For each sort column, you must specify a direction. Ascending means that the list starts at the lowest value ("AA", or "0", or "Jan-80"), and Descending starts the list from the highest values.
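Multi-key sorting with per-column directions can be modeled in Python: because sorted() and list.sort() are stable, applying the sorts from the least significant key back to the primary key produces the same ordering as one combined comparison. The rows and key choices below are invented for illustration.

```python
# Hypothetical rows and sort specification: primary key first, then secondary.
rows = [
    {"Vendor": "Acme", "Result": 300},
    {"Vendor": "Acme", "Result": 450},
    {"Vendor": "Zeta", "Result": 450},
]
keys = [("Result", "desc"), ("Vendor", "asc")]

# Stable sorts applied in reverse key order give the combined ordering.
for field, direction in reversed(keys):
    rows.sort(key=lambda r: r[field], reverse=(direction == "desc"))

print([(r["Vendor"], r["Result"]) for r in rows])
# [('Acme', 450), ('Zeta', 450), ('Acme', 300)]
```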

Format: Output Format
Results may be returned in one of three formats.
HTML3.2 Table
Uses HTML table markup, which allows your browser to arrange the display.
Preformatted Text
Makes the server format the data returned. This is most useful when a large number of fields are to be returned, because most browsers do not perform well with a large number of columns in a table.
Comma Separated Values (CSV)
May not look pretty, but if saved to a local file it can easily be loaded into any spreadsheet application, where you can arrange, format, and calculate to your heart's delight.
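Besides spreadsheets, CSV output is easy to process with a script. A minimal Python sketch, assuming output shaped like the rint2017 fields (the values here are invented, not real results):

```python
import csv
import io

# Hypothetical CSV as the engine might return it when saved to a file.
csv_text = """System,# Cores,Result
Alpha 100,32,250
Beta 200,128,410
"""

# csv.DictReader maps each data row to a dict keyed by the header line.
rows = list(csv.DictReader(io.StringIO(csv_text)))
for row in rows:
    print(row["System"], int(row["# Cores"]), float(row["Result"]))
```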

Engine Basics

This search engine is designed to be controlled by two basic parts: the configuration used, and the datasets searched.

The configuration controls many of the aspects of this engine. It specifies which datasets are appropriate, and which views of those datasets are supported.

The datasets contain the available data for published results. SPEC breaks its publications into quarterly 'buckets'; thus there is a different dataset for each calendar quarter. This allows you to select how far back into history you want to go. If you want only the last year, specify a range covering the last four quarters; if you want to know about results performed during the last half of 1995, you may specify the range covering the September and December issues in '95; etc. The default settings cover all available quarters.
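Selecting a quarter range amounts to expanding two endpoints into the per-quarter "buckets" in between. A small Python sketch of that expansion (the "1995Q3"-style naming is hypothetical, used only for the example):

```python
def quarters(start, end):
    """Yield (year, quarter) pairs from start to end, inclusive."""
    year, q = start
    while (year, q) <= end:
        yield year, q
        q += 1
        if q > 4:               # roll over into the next calendar year
            year, q = year + 1, 1

# The last half of 1995: the September and December quarterly buckets,
# plus the first quarter of 1996 for comparison.
print([f"{y}Q{q}" for y, q in quarters((1995, 3), (1996, 1))])
# ['1995Q3', '1995Q4', '1996Q1']
```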

There may be multiple configurations for the same datasets. Typically, the more focused a configuration is towards a particular benchmark, the more information about each result is available. In other words, the summary configuration views commonly support only the highest level information about a system and its result; the more specific configuration would support columns including system configuration details, the specific software versions, and/or individual component benchmark results.

Finally, most configurations support links to the reporting page for each result as the last column of the data returned. These reporting pages (available in a variety of formats), contain the full disclosure for each particular result. Consult these pages to learn all the details about a result.

Engine Modes

This engine supports five different modes of operation:
Help
The current mode, what you are looking at right now. Displays the available help information about the engine and descriptions of the fields in the current configuration.
Simple
The starting interface. Offers a simple form for obtaining results using mostly default settings.
Form
The configurable interface. Offers a very configurable interface to the available results.
Fetch
The main workhorse. Takes the configuration and settings from the simple and form interfaces, and performs the desired lookups and displays the results.
Dump
Brute force. Changes settings to return all available data and then calls fetch. Returns all data in the current configuration; more data may be available in other configurations, and all the details are in each result's disclosure page. Because this is usually more data than browsers can handle as tables, these dumps are available in only two forms: preformatted text, which can be easily scrolled, and comma-separated values, which can be saved locally and loaded into a spreadsheet application.

Further Assistance

If you have comments or questions not addressed here, please contact the SPEC office.



webmaster@spec.org
Thu Nov 28 04:44:04 2024