[comp.windows.x] X terminal comparison

db@sunbim.be (Danny Backx) (11/12/90)

I am getting really frustrated with the bad benchmarks that Hank Nussbacher is
posting on mailing lists.

He is spreading a 'router scorecard' to several mailing lists on a monthly
basis. In this scorecard, he makes the same basic mistakes that he makes in
his recently posted X terminal benchmark. The basic problem is that he does
not want to listen to what the rest of the world thinks.

To be precise, in his 'router scorecard' Hank compares routers from different
vendors, but aimed at different market segments. He compares Cisco's AGS+, the
high-end model, with Wellfleet's Link Node, the mid-range model. His argument
is that they have the same physical dimensions and weight. He refuses to
discuss the subject, but keeps publishing new versions of his scorecard.

In the X terminal benchmark he posted to Xpert, I see several problems:

1. Hank compares the NCD models 16 and 17c to Visual models X19turbo and X19+.
	Nothing is wrong with this in itself, but the effect is to compare
	Visual's high-end black&white X terminals to NCD's low-end b&w terminal
	(the NCD16). This creates a false impression of NCD's b&w terminals.
	Note that the omitted NCD19 has the same screen dimensions as Visual's
	X19+ or X19turbo; even their weights are comparable.

	An objective benchmark would also list the high-end b&w NCD's (NCD19)
	as well as the low-end Visual (X15).

2. I don't know which software versions were used on the non-NCD
	equipment. On the NCDs, an old version (2.1) of the X server software
	was used; you can see this in the '11.2001' server version number.

	The current 2.2 version improves overall performance by 15-30%,
	while the upcoming 2.3 version will go even further.

3. Hank says that on the NCD17c, the font "fixed" was unavailable.
	This is ABSOLUTELY UNTRUE. The font "fixed" is permanently resident on
	every single NCD, precisely because so many X applications use it as
	their default. (A small check for this is sketched after this list.)

4. Hank doesn't describe the method used for his measurements in any detail.
	A description would be useful for several things:
		- running the same tests on more X terminals, to extend the list
		- verifying the correctness of (a) these results
			and (b) the method used in the benchmark.
	(A rough sketch of what such a test could look like follows this list.)

5. Hank's benchmark is only a benchmark. Other important points when comparing
	different X terminals are:
		- ease of installation
		- administrative features
		- ergonomic features
		- connectivity options
		- options for management of large sites
		- ability to run local clients
		- network management (SNMP)
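
For point 3, anyone with access to one of these terminals can check the
availability of a font for themselves. Below is a minimal sketch in C, using
nothing but plain Xlib; the command line arguments (display name and font
name) are just examples, point it at whichever X terminal you want to test.
The standard xlsfonts client can of course be used for the same purpose.

#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>

/* fontcheck: ask an X server whether a given font (default "fixed") is
 * available.  Compile with:  cc fontcheck.c -o fontcheck -lX11          */
int main(int argc, char **argv)
{
    const char *fontname = (argc > 2) ? argv[2] : "fixed";
    Display *dpy = XOpenDisplay((argc > 1) ? argv[1] : NULL);
    XFontStruct *font;

    if (dpy == NULL) {
        fprintf(stderr, "fontcheck: cannot open display\n");
        exit(1);
    }

    font = XLoadQueryFont(dpy, fontname);
    if (font == NULL)
        printf("font \"%s\" is NOT available on this server\n", fontname);
    else
        printf("font \"%s\" is available, height %d\n",
               fontname, font->ascent + font->descent);

    XCloseDisplay(dpy);
    return 0;
}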
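
For point 4, and in particular for test 1 in Hank's table ("draw an image
string, font=fixed, height=13, results in chars/sec"), here is a rough sketch
of how such a number *could* be obtained. I want to stress that this is only
my guess at a plausible method, not the method actually used (which is exactly
the problem); the test string, window size and repetition count are arbitrary
choices of mine.

#include <stdio.h>
#include <string.h>
#include <sys/time.h>
#include <X11/Xlib.h>

#define REPEAT 2000     /* arbitrary; raise it on fast servers */

/* drawbench: time XDrawImageString() with the "fixed" font and report
 * characters per second.  Compile with:  cc drawbench.c -o drawbench -lX11 */
int main(void)
{
    char text[] = "the quick brown fox jumps over the lazy dog 0123456789";
    int len = (int) strlen(text);
    Display *dpy = XOpenDisplay(NULL);          /* honours $DISPLAY */
    XFontStruct *font;
    Window win;
    GC gc;
    XEvent ev;
    struct timeval t0, t1;
    double seconds;
    int i;

    if (dpy == NULL)
        return 1;

    win = XCreateSimpleWindow(dpy, DefaultRootWindow(dpy), 0, 0, 600, 100, 0,
                              BlackPixel(dpy, DefaultScreen(dpy)),
                              WhitePixel(dpy, DefaultScreen(dpy)));
    XSelectInput(dpy, win, ExposureMask);
    XMapWindow(dpy, win);
    do {                                        /* wait until the window is up */
        XNextEvent(dpy, &ev);
    } while (ev.type != Expose);

    font = XLoadQueryFont(dpy, "fixed");
    if (font == NULL)                           /* see point 3 above */
        return 1;
    gc = XCreateGC(dpy, win, 0, NULL);
    XSetFont(dpy, gc, font->fid);
    XSync(dpy, False);                          /* finish setup before timing */

    gettimeofday(&t0, NULL);
    for (i = 0; i < REPEAT; i++)
        XDrawImageString(dpy, win, gc, 10, 50, text, len);
    XSync(dpy, False);                          /* wait until the server has drawn it all */
    gettimeofday(&t1, NULL);

    seconds = (t1.tv_sec - t0.tv_sec) + (t1.tv_usec - t0.tv_usec) / 1.0e6;
    printf("%.0f chars/sec\n", (double) REPEAT * len / seconds);

    XCloseDisplay(dpy);
    return 0;
}

Publishing this amount of detail (the client program, the string, the number
of repetitions, and whether the final XSync round trip is counted) would make
the numbers reproducible.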

A final point I would like to make is that Hank didn't do the benchmarks
himself. While this is a good way to avoid flames, I would suggest that he
not post any more of these magnificent examples of misinformation on the net.

	Danny Backx
	BIM Networks System Engineer

E-Mail: db@sunbim.be    (or uunet!mcsun!ub4b!sunbim!db)

Telephone: +32(2)759.59.25	Fax : +32(2)759.47.95

Postal Mail :
	Danny Backx
	BIM
	Kwikstraat 4
	3078 Everberg
	Belgium

Relevant parts of Hank's mail:

> The client was a Sun Sparcstation 4.  The network was an isolated
> Ethernet (via a DELNI) with an HP Lanalyzer checking the network.
> 
> 40 separate benchmarks were performed on all six X-terminal stations.
> The results are presented below.  I did not perform the actual
> benchmarks but they have been provided to me by the person who
> did them and who I have worked with.  No further benchmarks are planned.
> 
> I will let you each draw your own conclusions from these benchmarks,
> but one fact is quite obvious: not all X-terminals are equal.
> 
>                                X-terminal benchmarks
>                                    November 1990
>                                ---------------------
> 
> 
> 
>              Vaxstat.  Visual     Visual    NCD16     NCD17c   Tektronix
>              3100      x19turbo   x19PLUS                       XP27
>             +---------+---------+---------+---------+---------+---------+
> Server      |DEC-     |Visual   |Visual   |NCD      |NCD      |Tektronix|
>  vendor     | WINDOWS |         |         |         |         |         |
> Xserver ver.|11.11    |11.3     |11.3     |11.2001  |11.2001  |11.0     |
[...] 
> 1) Draw an image string (font=fixed, height=13).  Results in chars/sec.
>    Note: for NCD17c, font fixed is not available.
[...] 
> 
> Hank Nussbacher
> HANK@VM.TAU.AC.IL
> Israel
> Phone: 972-3-5450887
> Fax: 972-3-416138

de5@ornl.gov (Dave Sill) (11/13/90)

[note: followups directed to comp.benchmarks]

In article <9011121342.AA13630@sunbim.be>, db@sunbim.be (Danny Backx) writes:
>
>	An objective benchmark would also list the high-end b&w NCD's (NCD19)
>	as well as the low-end Visual (X15).

So unless he has one of everything he can't post any results?  That hardly
seems necessary.

>	The current 2.2 version improves overall performance by 15-30%,
>	while the upcoming 2.3 version will go even further.

He identified which versions were tested.

>4. Hank doesn't describe the method used for his measurements in any detail.
>	A description would be useful for several things:
>		- running the same tests on more X terminals, to extend the list
>		- verifying the correctness of (a) these results
>			and (b) the method used in the benchmark.

I agree it would be good to make this available.  I don't know that it's
necessary to include it in every report posted, as long as its availability is
mentioned.

>5. Hank's benchmark is only a benchmark. Other important points when comparing
>	different X terminals are:
>		- ease of installation
>		- administrative features
>		- ergonomic features
>		- connectivity options
>		- options for management of large sites
>		- ability to run local clients
>		- network management (SNMP)

No kidding.  Benchmarks is benchmarks.  The existence of fools who would use
benchmark results as the only criterion for selecting a system should not
preclude the distribution of the information to those who apply a more
thorough selection process.

>A final point I would like to make is that Hank didn't do the benchmarks
>himself. While this is a good way to avoid flames, I would suggest that he
>not post any more of these magnificent examples of misinformation on the net.

I encourage Hank to continue posting his results, and I hope people like you
will continue to analyze his postings for validity and consistency.

-- 
Dave Sill (de5@ornl.gov)
Martin Marietta Energy Systems
Workstation Support

prc@erbe.se (Robert Claeson) (11/14/90)

In a recent article db@sunbim.be (Danny Backx) writes:

>1. Hank compares the NCD models 16 and 17c to Visual models X19turbo and X19+.
>	Nothing is wrong with this in itself, but the effect is to compare
>	Visual's high-end black&white X terminals to NCD's low-end b&w terminal
>	(the NCD16). This creates a false impression of NCD's b&w terminals.
>	Note that the omitted NCD19 has the same screen dimensions as Visual's
>	X19+ or X19turbo; even their weights are comparable.
>
>	An objective benchmark would also list the high-end b&w NCD's (NCD19)
>	as well as the low-end Visual (X15).

Ahem. The X-19 Plus from Visual Technology *is* a low-end terminal, to be
compared with NCD's NCD19b. Visual's two high-end terminals are the X-15 Turbo
and X-19 Turbo, and its low-end terminals are the X-15 and X-19 Plus (there's
also an X-19, which has been superseded by the X-19 Plus; the old X-19 has
the same screen resolution as the NCD19b). Visual also has a low-low-end
terminal named the X-14/ES, while NCD's sub-low-end terminal is named NCD15 or
possibly NCD15b (the correct name escapes me now). So there.

>2. I don't know which software versions were used on the non-NCD
>	equipment. On the NCDs, an old version (2.1) of the X server software
>	was used; you can see this in the '11.2001' server version number.
>
>	The current 2.2 version improves overall performance by 15-30%,
>	while the upcoming 2.3 version will go even further.

To be fair, the Visual terminals were also tested with an old version of
the software. The current release, 3.0, is based on X11R4 and improves
performance twenty-fold in some cases.

-- 
Robert Claeson                  |Reasonable mailers: rclaeson@erbe.se
ERBE DATA AB                    |      Dumb mailers: rclaeson%erbe.se@sunet.se
Jakobsberg, Sweden              |  Perverse mailers: rclaeson%erbe.se@encore.com
Any opinions expressed herein definitely belong to me and not to my employer.