[comp.benchmarks] SPEC results

de5@ornl.gov (Dave Sill) (12/15/90)

Disclaimer: This information is provided as a public service.  It is
unofficial and incomplete, and may even be wrong.  For complete SPEC
benchmark results, contact: 

        Systems Performance Evaluation Cooperative (SPEC)
        c/o Waterside Associates
        39510 Paseo Padre Parkway
        Suite 350
        Fremont, CA  94538
        415/792-3334

        (After January 1, 1991)
        SPEC
        c/o Waterside Associates
        39138 Fremont Blvd.
        Fremont, CA  94538
        415/792-3334

(Elapsed time in seconds for each program; Mean is the geometric mean
of the ten times.)

Machine           GCC   Esp  Spice  Dod  Nasa7   Li  Eqn  Mat Fppp  Tom    Mean
-------           ---   ---  -----  ---  -----   --  ---  --- ----  ---    ----
DECstation 5000    83   122   1747  103    889  284   60  266  138  153     209
SPARCserver 490    69   138   1458  133   1012  318   63  201  162  215     220
Harris Night Hawk  65   108   1782  156   1240  313   65  211  163  182     223
SPARCstation 330  108   196   2096  196   1439  553   88  308  232  324     327
DECsystem 5400    135   161   2633  146   1569  510   81  434  227  268     328
DECsystem 5810    153   143   2876  128   2047  529   73  545  216  291     342
DECstation 3100   136   189   2523  165   1525  473   98  460  243  268     343
SPARCstation 1    139   254   2915  369   1968  690  113  410  389  445     459
DECstation 2100   196   254   3418  224   2068  641  125  627  340  363     467
Sun 4/260         150   292   5017  551   5878  683  133  485 1030  628     678
VAX 11/780       1482  2266  23951 1863  20093 6206 1101 4525 3038 2649    3868

-- 
Dave Sill (de5@ornl.gov)	  It will be a great day when our schools have
Martin Marietta Energy Systems    all the money they need and the Air Force
Workstation Support               has to hold a bake sale to buy a new bomber.

de5@ornl.gov (Dave Sill) (01/19/91)

Here's the latest compilation of SPEC results.  Send additions,
corrections, and comments to me at de5@ornl.gov.

How does one compute a geometric mean?  How *exactly* is the SPECmark
computed?  This table shows the elapsed time in seconds for each of the
ten programs.  Would SPECratios be preferable?  E.g., which is better:
the 83-second GCC figure, or the 17.3 GCC SPECratio?  I'm not sure how
easily the ratios would fit in 80 columns.
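
My working assumption, which may be wrong: each SPECratio is the VAX
11/780 reference time divided by the machine's time for that program,
and the SPECmark is the geometric mean of the ten SPECratios (the
tenth root of their product).  Here's an unofficial Python sketch
using the DEC DS5000/200 and VAX 11/780 rows from the table below;
the official reference times may differ slightly:

import math

# Times in seconds, copied from the table below.
vax_ref = [1482, 2266, 23951, 1863, 20093, 6206, 1101, 4525, 3038, 2649]
ds5000  = [  83,  122,  1747,  103,   889,  284,   60,  266,  138,  153]

# SPECratio for each test: reference time / measured time.
ratios = [ref / t for ref, t in zip(vax_ref, ds5000)]

# SPECmark: geometric mean of the ten ratios, computed via logs.
specmark = math.exp(sum(math.log(r) for r in ratios) / len(ratios))

print("GCC SPECratio: %.1f" % ratios[0])  # ~17.9 with these times
print("SPECmark:      %.1f" % specmark)   # ~18.5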

Disclaimer: This information is provided as a public service.  It is
unofficial, incomplete, and may even have a typo or two.  For complete
SPEC benchmark results, contact:

        Systems Performance Evaluation Cooperative (SPEC)
        c/o Waterside Associates
        39138 Fremont Blvd.
        Fremont, CA  94538
        415/792-3334

Machine           GCC   Esp  Spice  Dod  Nasa7   Li  Eqn  Mat Fppp  Tom  Notes
-------           ---   ---  -----  ---  -----   --  ---  --- ----  ---  -----
Alacron AL860     100   105   1323  109    351  286   60  161  117   61  5
Apollo DN10010    123   176   3093   81    985  926  141  208   95  133  1
CDC 4340           97   131   1761  118   1110  306   64  333  174  187  7
CDC 4360           77   126   1774  109   1092  260   60  340  152  157  7
CDC 4380           79   125   1774  107   1092  260   60  338  151  155  7
CDC 4680 BETA      33    52    635   49    507  138   31   86   58   72  7
CDC 910B-621       82   136   2395  127   1395  381   68  466  189  245  7
CDC 920B-450      111   163   2260  168   1595  400   83  457  249  248  7
DEC 6000/410      291   349   3863  270   2450  862  164  696  405  358  7
DEC DS2100        196   254   3372  278   2230  646  125 1002  476  352  1
DEC DS2100        196   254   3418  224   2068  641  125  627  340  363  4,7
DEC DS3100        136   189   2523  165   1525  473   98  460  243  268  4,7
DEC DS3100        145   194   2500  208   1646  480   99  749  292  260  1
DEC DS5000/200     83   122   1747  103    889  284   60  266  138  153  4,7
DEC DS5400        135   161   2633  146   1569  510   81  434  227  268  4,7
DEC DS5810        153   143   2876  128   2047  529   73  545  216  291  4
DG AV310          150   173   2886  270   2161  460  105  415  366  319  7
DG AV410          154   177   2921  278   2258  481  107  439  380  331  7
DG AV5010         136   170   2753  262   2115  450  103  408  349  298  7
DG AV6200         111   133   2177  203   1688  355   81  321  279  241  7
HP 9000/340       485   982  13360 1756  16617 1910  505 4625 3090 2839  1
HP 9000/340       478   985  13306 1694  16744 1881  500 4525 3038 2943  7
HP 9000/370       273   471   7134  590   5576  954  248 1625  923  766  1
HP 9000/370       274   472   7044  582   5581  955  250 1616  921  757  7
HP 9000/834       146   256   2761  214   2218  530  109  432  339  311  1
HP 9000/834       145   255   2753  214   2208  530  109  431  338  312  7
HP DN10010        116   176   2013   81    980  559   99  206   95  133  7
Harris Night Hawk  65   108   1782  156   1240  313   65  211  163  182  6
IBM RS/6000-320   108   139   1210   86    755  398   62  259   71   47  7
IBM RS/6000-520   109   139   1210   87    753  398   61  259   71   47  7
IBM RS/6000-530    83   109    868   67    566  313   48  208   56   35  7
IBM RS/6000-540    71    91    721   56    463  262   41  171   46   29  2,7
IBM RS/6000-730    85   109    868   67    549  313   49  202   56   35  7
IBM RS/6000-930    85   109    868   67    566  313   48  208   56   35  7
Intel 486         107   186   2722  321   3464  369  100  476  434  616  7
Intel STAR860     120   113   1618  119    447  351   62  210  139   74  5
MIPS 6280 Beta     33    52    635   49    507  138   31   86   58   72  2,7
MIPS M/120-5      119   185   2424  170   1571  404   92  764  250  238  1,7
MIPS M/2000        77   125   1997  106   1198  263   62  541  151  158  1
MIPS M/2000        78   125   1979  106   1092  261   60  340  149  150  7
MIPS RC2030       169   198   2720  190   1679  448   98  793  644  248  1
MIPS RC3240        96   128   1979  117   1110  304   64  328  171  191  7
MIPS RC3260        80   126   2013  108   1104  266   62  345  152  153  7
MIPS Rx2030       172   192   2661  188   1511  437   96  471  620  241  7
MIPS Rx3230        81   126   1687  110    957  275   59  294  192  143  7
Moto 8608         193   197   3350  295   3187  458  129  520  488  509  1,7
Moto 8612          81    99   1618  153   1148  260   53  210  199  178  7
Moto 8864DP       138   205   3609  308   3508  478  135  597  537  540  1
Moto 8864SP        85   117   1916  184   1322  300   69  246  207  228  7
Moto 8864SP       106   146   2395  230   1647  376   86  308  260  285  7
Moto 8864SP       120   195   3279  290   3172  448  133  547  504  487  1,7
Moto MPC-100 est  106   146   2395  230   1647  376   86  308  260  285  7
Moto MPC-200 est   85   117   1916  184   1322  300   69  246  207  228  7
SGI 4D/210S        82   136   2395  127   1395  381   68  466  189  245  7
SGI 4D/25S        111   163   2260  168   1595  400   83  457  249  248  7
SGI 4D/320S        69    96   1565   94   1104  243   55  340  123  161  5
Solbourne 5/801    76   139   1607  161   1248  341   65  200  170  224  7
Stardent 3010      83   112   1629   95    319  343   60   42  104   43  2,7
Sun 4/260         150   292   5017  551   5878  683  133  485 1030  628  3
Sun SS SLC est    139   255   3373  433   2283  690  114  476  453  509  7
Sun SS1           138   254   2875  374   2308  689  113  409  387  470  1
Sun SS1           139   254   2915  369   1968  690  113  410  389  445  3,7
Sun SS1+          117   206   2395  305   1661  653   92  354  320  373  7
Sun SS330         107   196   2152  225   1800  552   88  315  323  352  1
Sun SS330         108   196   2096  196   1439  553   88  308  232  324  3,7
Sun SS490          69   138   1458  133   1012  318   63  201  162  215  3,7
VAX 11/780 ref   1482  2266  23951 1863  20093 6206 1101 4525 3038 2649  1

Notes:
    1. SPEC Newsletter, Volume 1, Issue 1, Fall 1989
    2. 39283@mips.mips.COM (mash@mips.COM (John Mashey))
    3. SPEC Newsletter, Volume 2, Issue 1, Winter 1990
    4. "Digital's RISC Family Graphics and CPU Performance Summary"
    5. "Supercomputing Review", September, 1990
    6. tom@hcx2.ssd.csd.harris.com (Tom Horsley)
    7. Calculated from "Your Mileage May Vary", John R. Mashey

--
Dave Sill (de5@ornl.gov)	  It will be a great day when our schools have
Martin Marietta Energy Systems    all the money they need and the Air Force
Workstation Support               has to hold a bake sale to buy a new bomber.

mccalpin@perelandra.cms.udel.edu (John D. McCalpin) (01/24/91)

> On 20 Jan 91 00:29:51 GMT, I (mccalpin@perelandra.cms.udel.edu) said:

>>> On 18 Jan 91 19:11:35 GMT, de5@ornl.gov (Dave Sill) said:

Dave> Here's the latest compilation of SPEC results.  Send additions,
Dave> corrections, and comments to me at de5@ornl.gov.

me> For those interested, I have placed the information that Dave posted
me> in an 'sc' spreadsheet file.  This is an ascii file in 'sc' internal
me> format for reading into a spreadsheet, *not* a version for printing.

I have updated the file to include some interesting calculations.
The original raw data has been hidden (using the sc 'z' command) so
that only the interesting stuff is shown by default.  You can recover
the original data with the sc 's' command.

The current table has the following columns:
	(1) machine name
	(2) total SPECmark
	(3) integer "SPECmark" (*)
	(4) floating-point "SPECmark" (*)
	(5) ratio: FP SPEC/Integer SPEC
	(6) ratio: Best SPEC/Worst SPEC

(*) Please note that these are entirely *unofficial* results and
categories and are intended only as an easy reference.

The "Integer SPECmark" is the geometric mean of the SPEC ratios for
the four integer tests: gcc, espresso, li, eqntott.
The "Floating-point SPECmark" is the geometric mean of the other 6
tests. 
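
If you'd rather reproduce the derived columns outside of sc, the
arithmetic looks roughly like this (an unofficial Python sketch; the
reference times are the VAX 11/780 row from Dave's table, and the
index groups follow his column order):

import math

def gmean(xs):
    # Geometric mean: exp of the average of the logs.
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# Dave's column order: GCC Esp Spice Dod Nasa7 Li Eqn Mat Fppp Tom
INT = (0, 1, 5, 6)          # gcc, espresso, li, eqntott
FP  = (2, 3, 4, 7, 8, 9)    # spice, doduc, nasa7, matrix300, fpppp, tomcatv

vax = [1482, 2266, 23951, 1863, 20093, 6206, 1101, 4525, 3038, 2649]

def summarize(times):
    ratios  = [ref / t for ref, t in zip(vax, times)]
    total   = gmean(ratios)                       # column (2)
    integer = gmean([ratios[i] for i in INT])     # column (3)
    fp      = gmean([ratios[i] for i in FP])      # column (4)
    return total, integer, fp, fp / integer, max(ratios) / min(ratios)

# e.g. summarize([83, 122, 1747, 103, 889, 284, 60, 266, 138, 153])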

The file is available by anonymous ftp from perelandra.cms.udel.edu
in pub/bench/spec.sc.
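
If your site has it, the file can also be fetched programmatically; a
minimal sketch using Python's standard ftplib, assuming the host and
path above are still current:

from ftplib import FTP

ftp = FTP("perelandra.cms.udel.edu")
ftp.login()                      # anonymous login
with open("spec.sc", "wb") as f:
    ftp.retrbinary("RETR pub/bench/spec.sc", f.write)
ftp.quit()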

Please send updates and corrections to me and I will keep the file
up-to-date, though if I get over 100 machines to list I will have to
either restructure the calculations or build a bigger version of sc!

Any ideas for other interesting calculated quantities to stick into
the table are welcome.....
--
John D. McCalpin			mccalpin@perelandra.cms.udel.edu
Assistant Professor			mccalpin@brahms.udel.edu
College of Marine Studies, U. Del.	J.MCCALPIN/OMNET

era@era.scd.ucar.edu (Ed Arnold) (02/02/91)

In article <1991Jan18.191135.1135@cs.utk.edu> de5@ornl.gov writes:
>Here's the latest compilation of SPEC results.  Send additions,
>corrections, and comments to me at de5@ornl.gov.

Thanks for posting this info.  Is this something you update on a
regular basis?

>Disclaimer: This information is provided as a public service.  It is
>unofficial, incomplete, and may even have a typo or two.  For complete
>SPEC benchmark results, contact:
>        Systems Performance Evaluation Cooperative (SPEC)
>        c/o Waterside Associates
>        39138 Fremont Blvd.
>        Fremont, CA  94538
>        415/792-3334

Unfortunately, it appears that SPEC wants to restrict the availability
of its info, because the price has increased significantly.  Last year,
the newsletter was $150 and the newsletter+tape $450; this year, they're
$399 and $699.  That's a 166% increase on the newsletter alone (55% on
the combination), and a subscription still gets you just four issues.
--
era@ncar.ucar.edu