[comp.benchmarks] l/m 3/1/91 benchmark info sources under construction

eugene@nas.nasa.gov (Eugene N. Miya) (03/02/91)

[WARNING: ROCKY ROAD AHEAD -- UNDER CONSTRUCTION, reduce MFLOPS ahead.]

You will typically ask, and I (amelia) must try to answer:
	1) Where can you pick up source code?
	2) Where can you get results?
This list can start to answer 1).  2) is a more difficult proposition,
and I can't begin to describe the problems; a few of the sources in 1)
can also answer 2).  As you will see, the problem isn't obtaining
information, it's sorting it out and learning something meaningful from it.

PERFECT
(post-facto acronym:
PERFormance Evaluation for Cost-effective Transformations)
Contact:	Lynn Pointer <lpointer@uicsrd.csrd.uiuc.edu>
		Center for Supercomputing Research and Development
		305 Talbot Laboratory
		104 South Wright Street
		Urbana, Illinois 61801
		(217) 244-0042


NISTLIB/NBSLIB
nistlib@cmr.ncsl.nist.gov
	send index
includes the LLNL loops (aka Livermore Loops, Livermore Fortran Kernels (LFK)),
	the NAS kernels, the Los Alamos benchmarks, parcbench, whetstone,
	dhrystone, mendez, and many others
no results are distributed from this server
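A first request, patterned after the nhfsstone mail shown further down
(whether the command belongs on the Subject: line or in the message body is
an assumption on my part; "send index" is the documented command):
 To: nistlib@cmr.ncsl.nist.gov
 Subject: send index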


NETLIB benchmark index (Linpack benchmark)
netlib@ornl.gov
	send index from benchmark
includes linpack (100x100, 300x300, 1000x1000)
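The netlib convention is one request per line in the mail message, of the
form "send <file> from <library>".  For example (the linpackd name below is
from memory and may be off; the index request returns the authoritative
file list):
	send index from benchmark
	send linpackd from benchmark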


SPEC
[Systems Performance Evaluation Cooperative]
c/o Waterside Associates
39138 Fremont Blvd.
Fremont, CA  94538
415/792-3334
SPEC membership costs:

	Initiation	$10,000
	Annual Dues	$ 5,000

SPEC Associate:

	Initiation	$2,500
	Annual Dues	$1,000

To qualify as a SPEC associate, you must be an accredited educational
institution or a non-profit organization.  An associate has no voting
privileges.  An associate will receive the newsletter and the benchmark
tapes as they are available.  In addition, an associate will have early
access to benchmarks under development so that an associate may act in an
advisory capacity to SPEC.

The SPEC tape still costs $699, which includes a one-year subscription
to the SPEC newsletter.  The tape by itself costs $300.
There are no discounts for the SPEC tape/newsletter.

Performance mailing list (oriented toward performance analysis
using quantitative modeling: Mean-Value Analysis [MVA], etc.):
perform-request@vuse.vanderbilt.edu

ACM/SIGMETRICS (Performance Evaluation Review)

Other benchmarks:
iostone:
	park@iris.ucdavis.edu
iocall:
	I have two versions. (Will substitute developer net address soon)
iobench:

bonnie:
	I have a version of this as well.
slalom:
Archive-directory: tantalus.al.iastate.edu:/pub/Slalom/ [129.186.200.15]

%A J. Gustafson
%A D. Rover
%A S. Elbert
%A M. Carter
%T SLALOM: The First Scalable Supercomputer Benchmark
%J Supercomputing Review
%D November 1990
%P 56-61

%A John Gustafson
%A Diane Rover
%A Stephen Elbert
%A Michael Carter
%T The Design of a Scalable, Fixed-Time Computer Benchmark
%R IS-5049/UC-32
%I Ames Lab, Iowa State University
%D 1990

Tantalus.al.iastate.edu has the SLALOM sources in various languages and
other things, including the latest reported results in
pub/Slalom/Reports/CurrentReport.
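
For the curious, the fixed-time idea is roughly this: instead of timing a
fixed-size problem, SLALOM fixes the run time (one minute) and asks how
large a problem the machine can finish inside that budget.  Below is a
minimal C sketch of that idea only, not the SLALOM code itself; solve() is
a made-up stand-in workload, and the real benchmark pins down the largest
solvable size far more carefully.

#include <stdio.h>
#include <time.h>

static volatile double sink;      /* keep the compiler from discarding work */

/* Made-up stand-in workload: an O(n^3) triple loop.  SLALOM's real work is
 * a radiosity computation; this is only here to have something to time. */
static double solve(int n)
{
    double s = 0.0;
    int i, j, k;
    for (i = 0; i < n; i++)
        for (j = 0; j < n; j++)
            for (k = 0; k < n; k++)
                s += (double)((i ^ j) + k) * 1e-9;
    return s;
}

int main(void)
{
    const double budget = 60.0;   /* the fixed time budget, in seconds */
    int n = 8, best = 0;

    for (;;) {
        clock_t t0 = clock();
        double elapsed;
        sink = solve(n);
        elapsed = (double)(clock() - t0) / CLOCKS_PER_SEC;
        if (elapsed > budget)
            break;                /* this size no longer fits in the budget */
        best = n;                 /* largest size solved within the budget */
        n += n / 4 + 1;           /* grow the problem and try again */
    }
    printf("largest n solved within %.0f seconds: %d\n", budget, best);
    return 0;
}

The figure of merit is then the problem size solved (and the work done to
solve it), which is what lets the same benchmark span PCs and supercomputers.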

smith:
	smith@berkeley.edu

OLTP (On-Line Transaction Processing):
	Jim Gray, Tandem
	Serlin?

Gibson mix:
	IBM

NCR benchmark:
	I have a copy (pio); it might not be up to date.

nhfsstone/NFSstone: (I have a copy of this.)
 To: nhfsstone-request@legato.com
 Subject: send unsupported nhfsstone

tcp/ip benchmarks:
ttcp
Archive: sgi.com:/sgi/src/ttcp.shar [192.48.153.1]

(TTCP is the most common TCP benchmark.  It is the most commonly used, the
most commonly abused and perverted to generate marketing numbers, and the
most commonly enhanced to do whatever the enhancer considers important.
Look for the authoritative source on brl.mil and more-or-less faithful
copies in many other places, including sgi.com.)
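
A typical raw-throughput run looks roughly like this (option sets vary among
the many enhanced copies, so trust the usage message of whichever ttcp you
actually fetch):

	on the receiving host:     ttcp -r -s
	on the transmitting host:  ttcp -t -s receivinghost

The -s flag makes ttcp source/sink an internal test pattern rather than read
stdin or write stdout; each side prints its transfer statistics when done.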

Rhealstone:
	real time
Lhynestone:
	graphics (TBD, work suspended)

MUSBUS (current version is 5.0, with patches bringing it to 5.2):
	uunet: comp.sources.unix
	volume11/musbus; the patches to 5.2 are in volume12/musbus5.2
	(an anonymous-FTP retrieval sketch follows the plum-benchmarks entry)

gbench: (X graphics benchmark)
	uunet: comp.sources.unix
	volume 15

tbench:
	uunet: comp.sources.misc

plum-benchmarks:
	uunet: comp.sources.unix
	volume 20
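
If you missed the postings, the uunet archive can also be reached by
anonymous FTP.  A rough sketch, using MUSBUS as the example (the directory
layout below is an assumption; do an ls and take the real file names from
the listing):

	ftp uunet.uu.net             (log in as "anonymous", any password)
	ftp> cd usenet/comp.sources.unix/volume11
	ftp> ls
	ftp> binary
	ftp> get <whatever the listing shows for musbus>
	ftp> quit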

Byte benchmarks:  (If they aren't at the site below, I have them; they are
	also integrated into MUSBUS[?].)
	me.utoronto.ca
	pub/byteunix.tar.Z

Gabriel: (LISP)
	See his book on the subject (MIT Press).  [I'll fill in the entry;
	I have the code.]

SEI Ada benchmarks:
	The Software Engineering Institute (CMU/DARPA) has a set of Ada
	benchmarks.  Contact them.  They may require special permission.

GPC (Graphic Performance Committee) Ken ......

RhosettaStone:
	Speech synthesis and recognition benchmark:
		e.g., How to recognize speech.
		How to wreck a nice beach.
	My sister machine: eos.arc.nasa.gov.

COMMERCIALLY AVAILABLE BENCHMARKS (i.e., mucho $$)
The following not only cost money, but in some cases people who pay for
them are not allowed to report results.  Caveat emptor.
Mention here does NOT constitute an endorsement.

AIM Technology Benchmark (Dronek and beyond)
[Actually some interesting stuff in one or two versions.]

Neal Nelson Business benchmark

Khornerstone (The Lab Report)
	ARS/Workstation Laboratories
	4324 N. Beltline, Suite C211
	Irving, TX 75038
	(214) 570-7100 [Fax (214) 570-4201]
$995 at this time.

If you are thinking of writing benchmarks, you need a good name.
Seriously, append the year to the end of your name, like name90.
It sets a useful point for revision control.  You can learn about
other things, but that's later.

NO SINGLE-FIGURE-OF-MERITs.