mrk@jax.org (Michael Kosowsky) (03/07/91)
How realistic is it to run X sessions across the Internet?  If one
proposed to give people access to a resource by making a client
available on a machine near a network hub, should one's proposal be
dismissed out of hand?  For 5 simultaneous users?  100?

Are there good rules of thumb for the traffic generated by average
text applications (e.g. xterm) and by graphics-intensive applications?
And any heuristics for filling in: "to run N {xterm, xmaze, ...}
sessions, my client machine should have X mbytes of RAM and should run
at Y mips"?

Thanks.
--
Michael Kosowsky
mrk@spretus.jax.org
mouse@lightning.mcrcim.mcgill.EDU (der Mouse) (03/07/91)
> How realistic is it to run X sessions across the Internet?

I have a program that I regularly run on a McGill machine connected to
a display in Switzerland.  The same program is occasionally used
running in Rhode Island, displaying here at McGill.  I sometimes have
terminal windows running at various distant places.  I find latency
more important than bandwidth, unless you're doing something like
displaying pictures, which involves shovelling large amounts of data
around.

> If one proposed to give people access to a resource by making a
> client available on a machine near a network hub, should one's
> proposal be dismissed out of hand?

Not for any reason related to what you've said so far.  (There may be
other reasons for rejecting it, such as administrative or political
reasons on the proposed machine, but that's another issue.)

> For 5 simultaneous users?  100?

It depends entirely on how many computrons the machine in question has
available, and possibly also on the characteristics of its network
connectivity.  (You probably don't want to annoy your regional net by
eating up large chunks of its bandwidth.  And regardless of how fast
the machine is, if it hangs off a 4800bps SLIP line, you won't get
tremendous performance! :-)

					der Mouse

			old: mcgill-vision!mouse
			new: mouse@larry.mcrcim.mcgill.edu
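[The 4800bps SLIP remark above is easy to quantify with a
back-of-envelope calculation.  The sketch below assumes a 1024x768
8-bit display and an uncompressed image push; those figures are
illustrative assumptions, not from the posts.]

```python
# Back-of-envelope: time to push one full-screen uncompressed image
# over a 4800 bps SLIP link (screen size and depth are assumptions).
width, height, depth_bytes = 1024, 768, 1   # assumed 8-bit display
link_bps = 4800                              # SLIP line speed from the post

image_bits = width * height * depth_bytes * 8
seconds = image_bits / link_bps

print(f"{seconds:.0f} s (~{seconds/60:.0f} min) per screen")
# → 1311 s (~22 min) per screen
```

Which is why, for image-heavy clients, bandwidth dominates; for an
xterm echoing keystrokes, the payload per round trip is tiny and the
network's latency sets the feel instead.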