Help for SPEC MPI® L2007 Results
This is a powerful engine for fetching results from SPEC.
There are two interfaces to this engine:
- Simple Interface
- The simple interface offers very limited functionality
and relies heavily on default settings.
It can handle many basic inquiries with minimal fuss.
- Configurable Interface
- The configurable interface offers a lot more functionality.
With this interface, it is possible to:
- select which columns you wish to see,
- limit the records returned by regular expressions and/or
numeric criteria over multiple columns,
- specify a multiple key sort ordering,
- choose which quarterly publications to pull results from,
- and pick one of three formats for the results display.
Most of this help information is for the configurable interface.
These are the fields available
in the current configuration (mpil2007).
Each configuration is likely to have
a different set of fields available.
[Note: there may be multiple configurations,
each with different fields, for any set of results.
Typically, the more specific a configuration is
to a particular benchmark,
the more fields that are available.]
- Benchmark
- The benchmark for which this is a result.
- Hardware Vendor
- The hardware vendor for the system under test.
- System
- The name of the system tested.
- Result (Peak)
- The single-figure-of-merit summary metric.
- Result (Base)
- The baseline (less aggressive) summary metric.
- Compute Cores Enabled
- The total number of cores enabled in the system.
- # Chips
- The total number of chips in the system.
- Compute Threads Enabled
- The number of compute threads enabled.
- Compute Nodes Used
- The total number of compute nodes used.
- Memory
- The total amount of main memory in the system under test.
- C Compiler
- The name and version of the C compiler (and associated software) used.
- C++ Compiler
- The name and version of the C++ compiler (and associated software) used.
- Fortran Compiler
- The name and version of the Fortran compiler (and associated software) used.
- MPI Library
- The name and version of the MPI library used.
- HW Avail
- The date that the hardware for this system is/will be generally available.
- SW Avail
- The date that the software used for this result is/will be generally available.
- System Class
- The system class: homogeneous or heterogeneous.
- Base Ranks
- The number of base MPI ranks used.
- Max Peak Ranks
- The maximum number of peak MPI ranks used.
- Min Peak Ranks
- The minimum number of peak MPI ranks used.
- 121.pop2 Peak
- The ratio for the 121.pop2 benchmark.
- 121.pop2 Base
- The baseline ratio for the 121.pop2 benchmark.
- 122.tachyon Peak
- The ratio for the 122.tachyon benchmark.
- 122.tachyon Base
- The baseline ratio for the 122.tachyon benchmark.
- 125.RAxML Peak
- The ratio for the 125.RAxML benchmark.
- 125.RAxML Base
- The baseline ratio for the 125.RAxML benchmark.
- 126.lammps Peak
- The ratio for the 126.lammps benchmark.
- 126.lammps Base
- The baseline ratio for the 126.lammps benchmark.
- 128.GAPgeofem Peak
- The ratio for the 128.GAPgeofem benchmark.
- 128.GAPgeofem Base
- The baseline ratio for the 128.GAPgeofem benchmark.
- 129.tera_tf Peak
- The ratio for the 129.tera_tf benchmark.
- 129.tera_tf Base
- The baseline ratio for the 129.tera_tf benchmark.
- 132.zeusmp2 Peak
- The ratio for the 132.zeusmp2 benchmark.
- 132.zeusmp2 Base
- The baseline ratio for the 132.zeusmp2 benchmark.
- 137.lu Peak
- The ratio for the 137.lu benchmark.
- 137.lu Base
- The baseline ratio for the 137.lu benchmark.
- 142.dmilc Peak
- The ratio for the 142.dmilc benchmark.
- 142.dmilc Base
- The baseline ratio for the 142.dmilc benchmark.
- 143.dleslie Peak
- The ratio for the 143.dleslie benchmark.
- 143.dleslie Base
- The baseline ratio for the 143.dleslie benchmark.
- 145.lGemsFDTD Peak
- The ratio for the 145.lGemsFDTD benchmark.
- 145.lGemsFDTD Base
- The baseline ratio for the 145.lGemsFDTD benchmark.
- 147.l2wrf2 Peak
- The ratio for the 147.l2wrf2 benchmark.
- 147.l2wrf2 Base
- The baseline ratio for the 147.l2wrf2 benchmark.
- License
- The number of the license used to generate this result.
- Tested By
- The people who have produced this result.
- Test Sponsor
- The name of the organization or individual that sponsored the test.
- Test Date
- When this result was measured.
- Published
- The date this result was first published by SPEC.
- Updated
- The date this result was last updated by SPEC, though most updates are clerical rather than significant.
- Disclosure
- Full disclosure report including all the gory details.
[Note: there are two kinds of query forms:
Simple and
Configurable.
Most of the features described here are available
only in the configurable query.]
- Content: Display
- What fields to display.
Each field can be set to
Display,
which will display the entire field,
or SKIP,
which will cause the field not to be displayed.
For fields of the string type,
it is also possible to limit the width of the field's display
by choosing one of the X Chars
options.
- Content: Criteria
- Limit results to only those that satisfy some criteria.
For each field it is possible to specify some criteria
that will be used to select only certain records
out of the entire dataset.
String criteria can be regular expressions,
numeric fields are compared against your provided
floating point values,
and date fields are compared against
the specified month and year.
You may specify criteria for any and all fields,
whether or not those fields will be displayed.
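As an illustration only (not the engine's actual implementation), the three kinds of criteria could be applied to a set of records as below; the field names mirror columns described above, but the sample data and the matches function are hypothetical.

```python
import re

# Hypothetical result records; field names mirror the columns above.
records = [
    {"System": "ClusterA", "Result (Base)": 12.5, "HW Avail": (2007, 6)},
    {"System": "ClusterB", "Result (Base)": 30.1, "HW Avail": (2008, 1)},
    {"System": "BladeC",   "Result (Base)": 8.2,  "HW Avail": (2007, 9)},
]

def matches(rec):
    # String criterion: a regular expression on the System field.
    if not re.search(r"^Cluster", rec["System"]):
        return False
    # Numeric criterion: compared against a floating point value.
    if not rec["Result (Base)"] >= 10.0:
        return False
    # Date criterion: compared against a (year, month) pair.
    if not rec["HW Avail"] <= (2007, 12):
        return False
    return True

selected = [r["System"] for r in records if matches(r)]
print(selected)  # ['ClusterA']
```

Here ClusterB is excluded by the date criterion and BladeC by the regular expression, even though each satisfies the other tests.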
- Content: Duplicates
- Allows the removal of duplicates, such as
where there are multiple results for the same configuration.
Duplicates are defined to be records that all have
matching values across a specified set of fields.
Duplicates are then ranked according to
their values in a specified key field.
There are three possible actions for duplicates:
return all records (the default),
return the one result with the latest (or greatest) value,
return the one result with the earliest (or smallest) value.
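A sketch of the three duplicate-handling actions, assuming hypothetical records and field choices (this is not SPEC's code):

```python
from itertools import groupby

# Hypothetical records: two results for the same system configuration.
records = [
    {"System": "ClusterA", "Published": "2007Q2", "Result (Base)": 12.5},
    {"System": "ClusterA", "Published": "2007Q4", "Result (Base)": 14.0},
    {"System": "BladeC",   "Published": "2007Q3", "Result (Base)": 8.2},
]

match_fields = ("System",)   # duplicates share values in these fields
key_field = "Published"      # duplicates are ranked by this key field

def dedupe(recs, action="all"):
    group_key = lambda r: tuple(r[f] for f in match_fields)
    out = []
    for _, grp in groupby(sorted(recs, key=group_key), key=group_key):
        grp = list(grp)
        if action == "all":        # default: return every record
            out.extend(grp)
        elif action == "latest":   # keep the greatest key value
            out.append(max(grp, key=lambda r: r[key_field]))
        elif action == "earliest": # keep the smallest key value
            out.append(min(grp, key=lambda r: r[key_field]))
    return out

latest = dedupe(records, "latest")
print([(r["System"], r["Published"]) for r in latest])
# [('BladeC', '2007Q3'), ('ClusterA', '2007Q4')]
```

With the "latest" action, only the 2007Q4 result survives for ClusterA; with the default "all" action, both would be returned.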
- Content: Publication
- Specify in which datasets to look for results.
All SPEC results are published on a quarterly basis;
this allows you to specify the range of quarters
that you are interested in.
Note: there are some quarters where
no results were published for certain benchmarks;
datasets which would have no available results
will not be present in the selection list.
The default settings are for all quarters to be searched.
- Sorting: Column
- This search engine returns its findings in sorted order.
The ordering is based on up to three keys:
a primary and a secondary key,
and, if records are still tied,
a tertiary key is used to settle ties.
- Sorting: Direction
- For each sort column, you must specify a direction.
Ascending
means that the list starts at
the lowest value ("AA", or "0", or "Jan-80"), and
Descending
starts the list
from the highest values.
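A three-key sort with a per-key direction can be sketched as follows; the field choices are hypothetical, and the technique (applying the keys in reverse order over a stable sort) is one common way to implement it, not necessarily the engine's:

```python
# Hypothetical records to sort.
records = [
    {"Hardware Vendor": "AcmeHW", "# Chips": 4, "Result (Base)": 12.5},
    {"Hardware Vendor": "AcmeHW", "# Chips": 2, "Result (Base)": 14.0},
    {"Hardware Vendor": "BetaHW", "# Chips": 4, "Result (Base)": 8.2},
]

# Sort specification: (field, direction). Later keys settle ties.
sort_keys = [
    ("Hardware Vendor", "ascending"),   # primary
    ("# Chips", "descending"),          # secondary
    ("Result (Base)", "ascending"),     # tertiary
]

# Apply the keys in reverse order. Python's sort is stable, so the
# primary key dominates, with ties settled by the later keys.
for field, direction in reversed(sort_keys):
    records.sort(key=lambda r: r[field],
                 reverse=(direction == "descending"))

print([(r["Hardware Vendor"], r["# Chips"]) for r in records])
# [('AcmeHW', 4), ('AcmeHW', 2), ('BetaHW', 4)]
```

The two AcmeHW records tie on the primary key, so the descending secondary key places the 4-chip system first.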
- Format: Output Format
- Results may be returned in one of three formats.
HTML3.2 Table
- Uses HTML table markup,
which allows your browser to arrange the display.
Preformatted Text
- Makes the server format
the display of the data returned.
This is most useful when a large number of fields
are to be returned, because most browsers do not perform
well when there are a large number of columns in a table.
Comma Separated Values (CSV)
- May not look pretty, but if saved to a local file,
can be easily loaded into any spreadsheet application,
where you can arrange and format and calculate to your heart's
delight.
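A saved CSV file can also be processed programmatically; a minimal sketch using Python's standard csv module, with hypothetical column names and data standing in for a downloaded file:

```python
import csv
import io

# Stand-in for a file saved from the CSV output format.
data = io.StringIO(
    "System,Result (Base),Result (Peak)\n"
    "ClusterA,12.5,13.1\n"
    "BladeC,8.2,8.9\n"
)

# DictReader maps each row to a dict keyed by the header line.
rows = list(csv.DictReader(data))
best = max(rows, key=lambda r: float(r["Result (Peak)"]))
print(best["System"])  # ClusterA
```

In practice you would pass an open file object instead of the io.StringIO stand-in.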
This search engine is designed to be controlled by two basic parts:
the configuration used, and the datasets searched.
The configuration controls many of the aspects of this engine.
It specifies which datasets are appropriate,
and which views of those datasets are supported.
The datasets contain the available data for published results.
SPEC breaks its publications into quarterly 'buckets';
thus there is a different dataset for each calendar quarter.
This allows you to select how far back into history you want to go.
If you want only the last year,
specify a range covering the last four quarters;
if you want to know about results performed
during the last half of 1995,
you may specify the range covering
the September and December issues in '95; etc.
The default settings cover all available quarters.
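The range selection amounts to enumerating every calendar quarter between two endpoints; a small sketch (the quarter labels are purely illustrative):

```python
def quarters(start, end):
    """Yield (year, quarter) pairs from start through end inclusive."""
    y, q = start
    while (y, q) <= end:
        yield (y, q)
        q += 1
        if q > 4:
            y, q = y + 1, 1

# e.g. the last two quarters of 1995 plus the first of 1996.
labels = [f"{y}Q{q}" for y, q in quarters((1995, 3), (1996, 1))]
print(labels)  # ['1995Q3', '1995Q4', '1996Q1']
```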
There may be multiple configurations for the same datasets.
Typically, the more focused a configuration is
towards a particular benchmark,
the more information about each result is available.
In other words, the summary configuration views
commonly support only the highest level information about a system
and its result;
the more specific configuration would support columns including
system configuration details,
the specific software versions,
and/or individual component benchmark results.
Finally, most configurations support links to the reporting page
for each result as the last column of the data returned.
These reporting pages (available in a variety of formats),
contain the full disclosure for each particular result.
Consult these pages to learn all the details about a result.
This engine supports five different modes of operation:
Help
- The current mode: what you are looking at right now.
Displays the available help information about the engine
and descriptions of the fields in the current configuration.
Simple
- The starting interface.
Offers a simple form for obtaining results
using mostly default settings.
Form
- The configurable interface.
Offers a very configurable interface to the available results.
Fetch
- The main workhorse.
Takes the configuration and settings from the
simple
and form
interfaces,
and performs the desired lookups and displays the results.
Dump
- Brute force.
Changes settings to return all available data
and then calls fetch.
Returns all data in the current configuration;
more data may be available in other configurations,
and all the details are in each result's disclosure page.
Because this is usually more data than browsers can handle
as tables, these dumps are available in only two forms:
preformatted text,
which can be easily scrolled,
and comma separated values,
which can be saved locally
and loaded into a spreadsheet application.
Further Assistance
If you have comments or questions not addressed here,
please
contact the SPEC office.
webmaster@spec.org