What is SPEC/GPC and what does it do?
SPEC/GPC is a non-profit organization that sponsors the development of
standardized, application-based benchmarks that have value to the vendor,
research, and user communities. For more, see http://www.spec.org/gpc/publish/overview.htm.
What benchmarking projects are active under SPEC/GPC?
The OpenGL Performance Characterization (SPECopc℠)
group, begun in 1993, establishes graphics performance benchmarks for
systems running under the OpenGL application programming interface (API).
The group's SPECviewperf® benchmark is the most popular standardized
software worldwide for evaluating performance based on CAD/CAM, digital
content creation, and visualization applications. For more, see http://www.spec.org/gpc/opc.static/overview.htm.
The Application Performance Characterization (SPECapc℠) group was
formed in 1997 to provide a broad-ranging set of standardized benchmarks
for graphics-intensive applications. For more, see http://www.spec.org/gpc/apc.static/apcfaq.htm.
Why are some of the same applications (Pro/E, 3ds max, UGS) included in both SPECapc and SPECviewperf benchmark suites?
The two benchmark suites have different purposes and different types of
users. SPECapc benchmarks are designed to measure, as much as possible,
total performance for graphics-intensive applications. They typically
include tests for graphics, I/O, and CPU performance, and they require
users to have a license for the application on which they are based.
SPECapc benchmarks are based on large models and complex interactions,
and tend to take a long time to run.
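As a rough illustration of what summarizing "total performance" across subtests can look like, the C sketch below combines per-category scores (graphics, I/O, CPU) into one composite using a weighted geometric mean, a summary method commonly used in SPEC benchmarking. The categories match those named above, but the weights and scores are invented for the example and are not taken from any SPECapc benchmark.

    /* Hypothetical sketch: combining per-category subtest scores into one
     * composite with a weighted geometric mean, a summary method commonly
     * used by SPEC benchmarks. Categories, weights, and scores here are
     * invented for illustration. Compile with e.g. cc composite.c -lm */
    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        const char  *category[] = { "graphics", "I/O", "CPU" };
        const double weight[]   = { 0.50, 0.25, 0.25 };  /* must sum to 1.0 */
        const double score[]    = { 42.7, 18.3, 25.1 };  /* bigger is better */

        double composite = 1.0;
        for (int i = 0; i < 3; i++) {
            composite *= pow(score[i], weight[i]);
            printf("%-8s score %5.1f (weight %.2f)\n",
                   category[i], score[i], weight[i]);
        }
        printf("composite: %.1f\n", composite);
        return 0;
    }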
Viewsets, the benchmarks
that run on SPECviewperf, exercise only the graphics functionality of
the application. Because SPECviewperf strips away application overhead,
it allows direct performance comparisons of graphics hardware. SPECviewperf
does not require users to have licenses for the applications on which its
viewsets are based, which makes it accessible to a wider range of users.
SPECviewperf is also easier to use and faster to run than SPECapc
benchmarks.
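To make "exercising only the graphics functionality" concrete, here is a minimal C sketch of the idea behind a viewset: replay a fixed, repeatable stream of OpenGL calls in a tight loop with no application logic, then report frames per second. This is a hypothetical illustration assuming GLUT/freeglut for window setup, not SPECviewperf source code.

    /* Hypothetical viewset-style test, not SPECviewperf source: replay a
     * fixed stream of OpenGL calls and report frames per second.
     * Assumes freeglut; compile with e.g. cc sketch.c -lglut -lGL */
    #include <GL/glut.h>
    #include <stdio.h>

    #define FRAMES 500

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        glutCreateWindow("viewset sketch");
        glEnable(GL_DEPTH_TEST);

        int start = glutGet(GLUT_ELAPSED_TIME);   /* wall-clock milliseconds */
        for (int i = 0; i < FRAMES; i++) {
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            glRotatef(0.5f, 0.0f, 1.0f, 0.0f);    /* fixed, repeatable path */
            glBegin(GL_TRIANGLES);                /* stand-in for a recorded model */
            glVertex3f(-0.5f, -0.5f, 0.0f);
            glVertex3f( 0.5f, -0.5f, 0.0f);
            glVertex3f( 0.0f,  0.5f, 0.0f);
            glEnd();
            glutSwapBuffers();                    /* present the frame */
        }
        glFinish();                               /* wait for the GPU to drain */
        double secs = (glutGet(GLUT_ELAPSED_TIME) - start) / 1000.0;
        printf("%.1f frames/sec\n", FRAMES / secs);
        return 0;
    }

Because the timed loop contains nothing but graphics calls, differences in the reported rate between two machines reflect the graphics hardware and driver rather than the application.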
How can someone run SPECviewperf and/or SPECapc benchmarks and submit results for review and publication on the GPC News web site?
SPEC/GPC provides a wide range of plans to allow those who are not members
of the SPECopc or SPECapc project groups to submit results for publication
on this web site. For more information, see http://www.spec.org/gpc/publish/nonmember.html.
Whether or not results are submitted for publication on the GPC News
site, anyone publishing results for SPEC/GPC benchmarks must comply with
the benchmark license and run rules.
I cannot find benchmark results on the GPC News site for a vendor or system configuration that interests me. How can I get the results I'm seeking?
Submitting benchmark results for publication on the GPC
News web site is voluntary. If you are seeking specific results that
are not published on the site, you can try the following:
- Contact SPECopc <[email protected]> to inquire about SPECviewperf
results, or SPECapc <[email protected]> to ask about application
benchmark results. If the vendor is a member of the appropriate group,
a representative should be able to answer your question, and perhaps
even provide some results.
- Conduct a web search to see if any of the major publications that use
SPEC/GPC benchmarks, such as PC Magazine, have published the test
results you are seeking.
- If you have a customer service contact for the hardware vendor or ISV,
relay your request to him or her.
- If it is feasible, run your own benchmark tests using a SPECapc
benchmark or SPECviewperf.
Who do I contact if I have trouble running SPECviewperf or a SPECapc benchmark?
Contact SPECopc <[email protected]>
for problems with SPECviewperf or SPECapc <[email protected]>
for problems with application-based benchmarks.
How do I get my benchmark considered for adoption by SPECopc or SPECapc?
Send a description of the benchmark and links to information and/or downloads
to the appropriate e-mail alias above.
Why should I trust results from a vendor-sponsored benchmark organization? Isn't this a bit like the fox guarding the chicken coop?
Industry vendors have the highest level of interest in developing credible
benchmarks. Without good performance evaluation software, vendors would
not be able to do valid system comparisons when developing new products,
or gain recognition from the trade media and public for significant technology
advances.
Members of SPECopc and SPECapc do not develop benchmarks in a vacuum;
they base them on interaction with user groups, publications, application
developers, and others. Benchmarks are tested by different vendors on
different operating systems and in different environments before they
are released.
Contrary to some beliefs,
"vendor-driven" benchmarks are probably the most objective,
as they are not subject to personal biases. The competitive nature of
vendors provides a natural system of checks and balances that helps ensure
objective, repeatable benchmarks.
Have a SPEC/GPC question you want answered? Submit it to [email protected].