SPEC/OSG Result Search Engine

Help for SPECint® 2006 Results

This is a powerful engine for fetching results from SPEC.

There are two interfaces to this engine:

Simple Interface
The simple interface offers only a few options and relies heavily on default settings. It can handle many basic inquiries with minimal fuss.
Configurable Interface
The configurable interface offers a lot more functionality. With this interface, it is possible to choose which fields to display, restrict results with per-field criteria, remove duplicate results, control sorting, and select the output format.

Most of this help information is for the configurable interface.

Dataset Fields

These are the fields available in the current configuration (cint2006). Each configuration is likely to have a different set of fields available. [Note: there may be multiple configurations, each with different fields, for any set of results. Typically, the more specific a configuration is to a particular benchmark, the more fields are available.]

Benchmark
The benchmark for which this is a result.

Hardware Vendor
The hardware vendor for the system under test.

System
The name of the system tested.

# Cores
The number of cores in the system.

# Chips
The number of chips in the system.

# Cores Per Chip
The number of cores per chip in the system.

# Threads Per Core
The number of threads per processor core.

Processor
The name and speed of the central processor.

Processor MHz
The speed (MHz) of the central processor(s).

Processor Characteristics
Technical characteristics to help identify the processor.

CPU(s) Orderable
The valid number of CPU(s) orderable.

Auto Parallelization
Whether multiple threads were employed by a parallelizing compiler.

Base Pointer Size
Whether benchmarks in base used 32- or 64-bit pointers or a mixture.

Peak Pointer Size
Whether benchmarks in peak used 32- or 64-bit pointers or a mixture.

1st Level Cache
The size and structure of the first level cache.

2nd Level Cache
The size and structure of the second level cache.

3rd Level Cache
The size and structure of the third level cache.

Other Cache
The size and structure of any other levels of cache.

Memory
The amount of main memory (in MB) in the system under test.

Operating System
The name and version of the operating system running on the system under test.

File System
The type of file system used to hold the benchmark tree on the system under test.

Compiler
The name and version of the compiler (and associated software) used.

HW Avail
The date that the hardware for this system is/will be generally available.

SW Avail
The date that the software used for this result is/will be generally available.

Result
The single-figure-of-merit summary metric.

Baseline
The baseline (less aggressive) summary metric.

400 Peak
The ratio for the 400.perlbench benchmark.

400 Base
The baseline ratio for the 400.perlbench benchmark.

401 Peak
The ratio for the 401.bzip2 benchmark.

401 Base
The baseline ratio for the 401.bzip2 benchmark.

403 Peak
The ratio for the 403.gcc benchmark.

403 Base
The baseline ratio for the 403.gcc benchmark.

429 Peak
The ratio for the 429.mcf benchmark.

429 Base
The baseline ratio for the 429.mcf benchmark.

445 Peak
The ratio for the 445.gobmk benchmark.

445 Base
The baseline ratio for the 445.gobmk benchmark.

456 Peak
The ratio for the 456.hmmer benchmark.

456 Base
The baseline ratio for the 456.hmmer benchmark.

458 Peak
The ratio for the 458.sjeng benchmark.

458 Base
The baseline ratio for the 458.sjeng benchmark.

462 Peak
The ratio for the 462.libquantum benchmark.

462 Base
The baseline ratio for the 462.libquantum benchmark.

464 Peak
The ratio for the 464.h264ref benchmark.

464 Base
The baseline ratio for the 464.h264ref benchmark.

471 Peak
The ratio for the 471.omnetpp benchmark.

471 Base
The baseline ratio for the 471.omnetpp benchmark.

473 Peak
The ratio for the 473.astar benchmark.

473 Base
The baseline ratio for the 473.astar benchmark.

483 Peak
The ratio for the 483.xalancbmk benchmark.

483 Base
The baseline ratio for the 483.xalancbmk benchmark.

License
The number of the license used to generate this result.

Tested By
The people who have produced this result.

Test Sponsor
The name of the organization or individual that sponsored the test.

Test Date
When this result was measured.

Published
The date that this result was first published by SPEC.

Updated
The date that this result was last updated by SPEC, though most updates are clerical rather than significant.

Disclosure
Full disclosure report including all the gory details.

Query Features

[Note: there are two kinds of query forms: Simple and Configurable. Most of the features described here are available only in the configurable query.]

Content: Display
What fields to display. Each field can be set to Display, which will display the entire field, or to Skip, which will cause the field not to be displayed. For fields of the string type, it is also possible to limit the width of the field's display by choosing one of the X Chars options.
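For illustration, here is a minimal sketch (in Python, not the engine's own code) of how these three display settings behave; the record contents and the chosen settings are hypothetical.

    # Hypothetical record and display settings -- "display" keeps the whole
    # value, "skip" omits the field, and an integer limits a string field
    # to that many characters (the "X Chars" options).
    record = {"System": "Example Server 100", "Processor": "Example CPU 3000", "Result": 25.4}
    settings = {"System": 10, "Processor": "display", "Result": "skip"}

    shown = {}
    for field, value in record.items():
        rule = settings.get(field, "display")
        if rule == "skip":
            continue                              # field is not displayed at all
        if isinstance(rule, int) and isinstance(value, str):
            value = value[:rule]                  # truncate to the chosen width
        shown[field] = value

    print(shown)   # {'System': 'Example Se', 'Processor': 'Example CPU 3000'}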

Content: Criteria
Limit results to only those that satisfy some criteria. For each field it is possible to specify criteria that will be used to select only certain records out of the entire dataset. String criteria can be regular expressions, numeric fields are compared against the floating-point values you provide, and date fields are compared against the specified month and year. You may specify criteria for any and all fields, whether or not a field will be displayed.
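As a rough illustration (a Python sketch, not the engine itself), the three kinds of comparison might look like this; the criteria and field values are invented for the example.

    import re
    from datetime import date

    # Invented records using fields from this configuration.
    records = [
        {"Hardware Vendor": "Acme Corp",  "# Cores": 8,  "HW Avail": date(2008, 4, 1)},
        {"Hardware Vendor": "Widget Inc", "# Cores": 16, "HW Avail": date(2009, 1, 1)},
    ]

    def matches(rec):
        return (re.search(r"Acme", rec["Hardware Vendor"]) is not None  # string: regular expression
                and rec["# Cores"] >= 8.0                               # numeric: floating-point comparison
                and rec["HW Avail"] >= date(2008, 1, 1))                # date: month and year

    selected = [r for r in records if matches(r)]
    print(len(selected))   # 1 -- only the "Acme Corp" record satisfies all three criteria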

Content: Duplicates
Allows the removal of duplicates, such as where there are multiple results for the same configuration. Duplicates are defined to be records that have matching values across a specified set of fields. Duplicates are then ranked according to their values in a specified key field. There are three possible actions for duplicates: return all records (the default); return the one result with the latest (or greatest) value; or return the one result with the earliest (or smallest) value.
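A minimal Python sketch of this rule, assuming a hypothetical choice of grouping fields and key field:

    from datetime import date

    # Invented records; duplicates are records that agree on the group fields.
    records = [
        {"System": "Model X", "Processor MHz": 3000, "Published": date(2008, 6, 1), "Result": 24.1},
        {"System": "Model X", "Processor MHz": 3000, "Published": date(2009, 3, 1), "Result": 25.0},
        {"System": "Model Y", "Processor MHz": 2666, "Published": date(2008, 9, 1), "Result": 21.7},
    ]

    group_fields = ("System", "Processor MHz")   # hypothetical "matching fields"
    key_field = "Published"                      # hypothetical key used to rank duplicates

    best = {}
    for rec in records:
        group = tuple(rec[f] for f in group_fields)
        # keep the record with the latest (greatest) key value; flip the
        # comparison to keep the earliest (smallest) instead
        if group not in best or rec[key_field] > best[group][key_field]:
            best[group] = rec

    print(len(best))   # 2 -- one record remains per (System, Processor MHz) pair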

Content: Publication
Specify in which datasets to look for results. All SPEC results are published on a quarterly basis; this allows you to specify the range of quarters that you are interested in. Note: there are some quarters in which no results were published for certain benchmarks; datasets that would have no available results will not be present in the selection list. The default settings are for all quarters to be searched.

Sorting: Column
This search engine returns its findings in sorted order. The ordering is based on up to three keys: a primary, a secondary, and, if records are still tied, a tertiary key used to settle ties.

Sorting: Direction
For each sort column, you must specify a direction. Ascending means that the list starts at the lowest value ("AA", or "0", or "Jan-80"); Descending starts the list from the highest value.
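For example, a three-key sort with a direction per column could be sketched in Python as follows (the column choices and directions are hypothetical); because the sort is stable, applying the keys from least to most significant gives the described ordering.

    rows = [
        {"Hardware Vendor": "Acme", "Result": 25.4, "# Cores": 8},
        {"Hardware Vendor": "Acme", "Result": 30.1, "# Cores": 16},
        {"Hardware Vendor": "Zeta", "Result": 25.4, "# Cores": 4},
    ]

    sort_spec = [                       # (column, ascending?) -- primary key first
        ("Hardware Vendor", True),
        ("Result", False),
        ("# Cores", True),
    ]

    for column, ascending in reversed(sort_spec):   # least significant key first
        rows.sort(key=lambda r: r[column], reverse=not ascending)

    print([(r["Hardware Vendor"], r["Result"]) for r in rows])
    # [('Acme', 30.1), ('Acme', 25.4), ('Zeta', 25.4)]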

Format: Output Format
Results may be returned in one of three formats.
HTML3.2 Table
Which uses HTML table specifications, allowing your browser to arrange the display.
Preformatted Text
Which has the server format the returned data. This is most useful when a large number of fields are to be returned, because most browsers do not perform well when there are a large number of columns in a table.
Comma Separated Values (CSV)
Which may not look pretty, but if saved to a local file, can be easily loaded into any spreadsheet application, where you can arrange, format, and calculate to your heart's delight.
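A saved CSV file can also be read back programmatically rather than in a spreadsheet; this small Python sketch uses invented column values to show the idea.

    import csv
    import io

    # Stand-in for a file saved from the CSV output; values are invented.
    saved = io.StringIO(
        "Hardware Vendor,System,Result,Baseline\n"
        "Acme Corp,Model X,25.4,23.9\n"
        "Widget Inc,Model Y,21.7,20.8\n"
    )

    rows = list(csv.DictReader(saved))
    best = max(rows, key=lambda r: float(r["Result"]))
    print(best["System"], best["Result"])   # Model X 25.4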

Engine Basics

This search engine is designed to be controlled by two basic parts: the configuration used, and the datasets searched.

The configuration controls many of the aspects of this engine. It specifies which datasets are appropriate, and which views of those datasets are supported.

The datasets contain the available data for published results. SPEC breaks its publications into quarterly 'buckets'; thus there is a different dataset for each calendar quarter. This allows you to select how far back into history you want to go. If you want only the last year, specify a range covering the last four quarters; if you want to know about results performed during the last half of 1995, you may specify the range covering the September and December issues in '95; etc. The default settings cover all available quarters.
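As an illustration of the quarterly buckets, the following Python sketch enumerates the quarters in a requested range; the label format shown is an assumption for the example, not the engine's actual dataset names.

    def quarters(start, end):
        """Yield (year, quarter) pairs from start to end, inclusive."""
        year, q = start
        while (year, q) <= end:
            yield (year, q)
            q += 1
            if q > 4:
                year, q = year + 1, 1

    # e.g. only the last half of 1995 -- the September and December issues
    print([f"{y}Q{q}" for y, q in quarters((1995, 3), (1995, 4))])   # ['1995Q3', '1995Q4']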

There may be multiple configurations for the same datasets. Typically, the more focused a configuration is towards a particular benchmark, the more information about each result is available. In other words, the summary configuration views commonly support only the highest level information about a system and its result; the more specific configuration would support columns including system configuration details, the specific software versions, and/or individual component benchmark results.

Finally, most configurations support links to the reporting page for each result as the last column of the data returned. These reporting pages (available in a variety of formats), contain the full disclosure for each particular result. Consult these pages to learn all the details about a result.

Engine Modes

This engine supports five different modes of operation:
Help
The current mode, what you are looking at right now. Displays the available help information about the engine and descriptions of the fields in the current configuration.
Simple
The starting interface. Offers a simple form for obtaining results using mostly default settings.
Form
The configurable interface. Offers a very configurable interface to the available results.
Fetch
The main workhorse. Takes the configuration and settings from the simple and form interfaces, and performs the desired lookups and displays the results.
Dump
Brute force. Changes settings to return all available data and then calls fetch. Returns all data in the current configuration; more data may be available in other configurations, and all the details are in each result's disclosure page. Because this is usually more data than browsers can handle as tables, these dumps are available in only two forms: preformatted text, which can be easily scrolled, and comma separated values, which can be saved locally and loaded into a spreadsheet application.

Further Assistance

If you have comments or questions not addressed here, please contact the SPEC office.

