[net.astro] astronomical computing

woods@hao.UUCP (Greg Woods) (11/29/83)

  First of all, as I understand these two groups, this topic does not
belong in the expert subgroup, which was intended for technical
discussions *directly* related to astronomy. This is sort of a peripheral
topic (albeit an important one!), and so belongs in the more general group
net.astro. I have posted this article to both groups in an attempt to move
the discussion to where I think it belongs.

  Woods' first law of scientific computing (forgive me if someone has already
claimed this) is that whatever computing power is available will eventually
become saturated, i.e. it is never enough. We have two VAX 11/750s, a PDP
11/70, and a share of two CRAY-1As, and we still manage to just about
saturate everything (well, the second VAX isn't yet ready for general use;
it is being used to bring up 4.2BSD). Of course, we have to share the CRAYs
with the rest of NCAR, including the large 3-D cloud models. We are also
responsible for a good share of the SMM (Solar Maximum Mission) data analysis,
a share which is likely to increase after the scheduled repair of the SMM
satellite by the space shuttle astronauts (I think it is STS-13, but I'm not
positive). [This is one reason we just bought the two new VAXen!] I'm willing
to bet that any scientific institution reading this would report a similar
situation at their facility.

                	 GREG
-- 
{ucbvax!hplabs | allegra!nbires | decvax!brl-bmd | harpo!seismo | ihnp4!kpno}
       		        !hao!woods