eric@snark.uu.net (Eric S. Raymond) (12/03/89)
In <1933@eric.mpr.ca> Michael Hui wrote:
> GaAs cannot be a dead end. It is used quite a bit in microwave circuits,
> and now you can get industry standard pin-out and function PALs from
> Gazelle in GaAs.

Of course GaAs is useful for microwave and DSP -- what I don't believe is
that it will ever become a mainstream commodity process like CMOS or NMOS or
even ECL. There will always be *some* customers willing to buy such
technology, generally DOD or other organizations with weak cost-control.

Perhaps I didn't make my reasoning clear enough. Given the usual development
timescale of digital electronics, I claim that if GaAs were really such a
smart idea it would *already* be rich. GaAs is still poking around in niche
markets umpteen years after the pioneers, without ever having entered a
regime of exponential capacity growth and inverse-exponential price drop.

In this industry, technologies with such a profile are usually losers based
on a fundamental misreading either of engineering possibility or economics.
Niche markets and the enthusiasm of a few can keep them breathing stertorously
but they tend to end up having zero impact on the development of computing as
a whole.

Lisp machines. Forth machines. `Hybrid computing'. Thin-film memory. In
software, APL...PL/1...Algol 68. Computing history is littered with the
corpses of these perpetually promising youngsters. Today, we have ADA and
OS/2. And tomorrow, I am nearly sure `commodity GaAs' will join this dismal
list.
-- 
Eric S. Raymond = eric@snark.uu.net (mad mastermind of TMN-Netnews)
henry@utzoo.uucp (Henry Spencer) (12/03/89)
In article <1TfOZ0#142gXX=eric@snark.uu.net> eric@snark.uu.net (Eric S. Raymond) writes:
>GaAs is still poking around in niche markets umpteen years after the pioneers,
>without ever having entered a regime of exponential capacity growth and
>inverse-exponential price drop.

The basic problem with GaAs is the killer-micro syndrome again: it's not that
GaAs is all that bad, but that it is competing with silicon. In the time it
took you to read the previous sentence, the world's semiconductor industries
spent several thousand dollars on improving silicon-based technology. Given
that silicon is pretty good stuff, and seems to be nowhere near any important
fundamental limits, competing with this juggernaut is almost impossibly
difficult. Competing technologies have to be a *lot* better to make any
headway at all. GaAs just does not seem to be sufficiently better to capture
anything more than niche markets.
-- 
Mars can wait: we've barely | Henry Spencer at U of Toronto Zoology
started exploring the Moon. | uunet!attcan!utzoo!henry henry@zoo.toronto.edu
cmt@myrias.com (Chris Thomson) (12/10/89)
In article <1TfOZ0#142gXX=eric@snark.uu.net> eric@snark.uu.net (Eric S. Raymond) writes:
>In this industry, technologies with such a profile are usually losers based
>on a fundamental misreading either of engineering possibility or economics.
>Niche markets and the enthusiasm of a few can keep them breathing stertorously
>but they tend to end up having zero impact on the development of computing as
>a whole.
>
>Lisp machines. Forth machines. `Hybrid computing'. Thin-film memory. In
>software, APL...PL/1...Algol 68. Computing history is littered with the
>corpses of these perpetually promising youngsters. Today, we have ADA and OS/2.
>And tomorrow, I am nearly sure `commodity GaAs' will join this dismal list.

APL, PL/I and Algol68 had considerably more than zero impact. The impact of
Algol68 in particular has been far reaching, even though the language itself
is not used much anymore. Algol-W, Pascal, Modula and Ada all owe their
heritage directly to Algol68. C was influenced heavily by Algol68. PL/I (may
it fade away quietly) also contributed valuable lessons in language design.
APL features keep popping up, for instance as vector-valued subscripts in F8X.
-- 
Chris Thomson, Myrias Research Corporation    uunet!myrias!cmt or cmt@myrias.com
900 10611 98 Ave, Edmonton Alberta, Canada    Tel 403-428-1616  Fax 403-421-8979
eric@snark.uu.net (Eric S. Raymond) (12/12/89)
In <629274691.11005@myrias.com> Chris Thomson wrote:
> Algol68 in particular has been far reaching, even though the language itself
> is not used much anymore. Algol-W, Pascal, Modula and Ada all owe their
> heritage directly to Algol68. C was influenced heavily by Algol68.

Sorry, but this is quite wrong. You're thinking of Algol-60.

> PL/I (may it fade away quietly) also contributed valuable lessons in
> language design.

Yeah. All negative, and all still unlearned by many of those who should know
better -- witness ADA. I wish PL/1's impact *had* been nonzero...

> APL features keep popping up, for instance as vector-valued
> subscripts in F8X.

Sorry, you'll have to do better than this. PPL and even some versions of
BASIC had array-valued subscripting before APL.

This is wandering out of comp.arch's demesne. Followups to comp.lang.misc,
please.
-- 
Eric S. Raymond = eric@snark.uu.net (mad mastermind of TMN-Netnews)
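[The feature being argued over -- a vector-valued (array-valued) subscript --
just means indexing an array with another array of indices. A minimal sketch
in Python, standing in for the APL and F8X notation; the data values are
arbitrary examples.]

```python
# Vector-valued subscripting: an index vector selects several
# elements of an array at once, in the order the indices appear.
a = [10, 20, 30, 40, 50]
v = [4, 0, 2]                  # the index vector
picked = [a[i] for i in v]     # APL would write roughly a[v]
print(picked)                  # -> [50, 10, 30]
```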
gerry@zds-ux.UUCP (Gerry Gleason) (12/21/89)
In article <629274691.11005@myrias.com> cmt@myrias.com (Chris Thomson) writes:
>. . . APL features keep popping up, for instance as vector-valued
>subscripts in F8X.

The best use I've seen for APL was in a book on ALU architecture. They used
APL algorithms to describe hardware, with vector operations mapping to arrays
of hardware elements and control/data flow mapping to interconnections. The
APL descriptions were at least as clear as block diagrams would be.

Also, your comments on Algol (Algol-60 is probably the right reference) are
right on. It's the granddaddy of all block-structured languages.

Gerry Gleason
johnl@esegue.segue.boston.ma.us (John R. Levine) (12/21/89)
In article <53@zds-ux.UUCP> gerry@zds-ux.UUCP (Gerry Gleason) writes:
>The best use I've seen for APL was in a book on ALU architecture. They
>used APL algorithms to describe hardware, ...

The first place many of us saw APL was in the IBM Systems Journal in 1964.
There was a description of the architecture of the new System/360 written
entirely in an early version of APL. It was quite complete, even to such
things as the emergency pull switch.

It was the same issue with Bresenham's classic article about how to draw a
straight line on a raster device, a pen plotter attached to a 1620. There is
some extra cruft in the presentation of the method because he wanted to avoid
an integer division by two, which was very slow.
-- 
John R. Levine, Segue Software, POB 349, Cambridge MA 02238, +1 617 864 9650
johnl@esegue.segue.boston.ma.us, {ima|lotus|spdcc}!esegue!johnl
"Now, we are all jelly doughnuts."
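[The method Levine describes can be sketched in a few lines. This is a
minimal first-octant version in Python -- not Bresenham's original 1620
formulation -- using the standard doubled error terms, which is the usual way
the slow division by two is kept out of the loop.]

```python
def bresenham_line(x0, y0, x1, y1):
    """Integer-only line rasterization, first octant (0 <= dy <= dx).

    The decision variable is kept scaled by 2 so every update is an
    integer add or subtract; no division (or halving) ever occurs.
    """
    dx, dy = x1 - x0, y1 - y0
    points = []
    err = 2 * dy - dx          # midpoint decision variable, pre-doubled
    y = y0
    for x in range(x0, x1 + 1):
        points.append((x, y))
        if err > 0:            # midpoint below the true line: step up
            y += 1
            err -= 2 * dx
        err += 2 * dy
    return points
```

For example, `bresenham_line(0, 0, 10, 3)` produces 11 pixels starting at
(0, 0) and ending exactly at (10, 3), with y never jumping by more than one.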
pcg@aber-cs.UUCP (Piercarlo Grandi) (12/22/89)
In article <1989Dec21.013530.2455@esegue.segue.boston.ma.us> johnl@esegue.segue.boston.ma.us (John R. Levine) writes:
> It was the same issue with Bresenham's classic article about how to draw a
> straight line on a raster device, a pen plotter attached to a 1620. There
> is some extra cruft in the presentation of the method because he wanted to
> avoid an integer division by two which was very slow.
A thought that surely has architectural implications: there is another
algorithm for rasterizing lines, based on algebra and grammars. The idea is
that the line to draw is really made up of one section repeated over and
over; you just calculate this section once and then copy it again and again,
rather than making a (albeit simple) decision at every point drawn. The
algorithm was published, apparently around the same time as Bresenham's, by
a Belgian mathematician.
Guess what: it has never bloomed. Very few graphics people even know about
it, as it was a nice result of algebraic theory. Will it bloom? It is easy
to see that it can be much faster than Bresenham's -- especially important
with the increasing resolution of modern devices.
--
Piercarlo "Peter" Grandi | ARPA: pcg%cs.aber.ac.uk@nsfnet-relay.ac.uk
Dept of CS, UCW Aberystwyth | UUCP: ...!mcvax!ukc!aber-cs!pcg
Penglais, Aberystwyth SY23 3BZ, UK | INET: pcg@cs.aber.ac.uk
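[The repeated-section idea Grandi describes can be sketched as a run-slicing
rasterizer: instead of deciding pixel by pixel, compute the length of each
horizontal run of pixels in one step. This Python sketch is a reconstruction
of the general technique only, not the (unnamed) Belgian mathematician's
algorithm, and it handles just the simple case of a shallow line.]

```python
def line_runs(dx, dy):
    """Horizontal run lengths for a line of slope dy/dx, 0 < dy <= dx.

    Each of the dy runs has length dx // dy or one more; an error
    accumulator spreads the remainder evenly along the line, so the
    inner loop executes dy times rather than dx times.
    """
    q, r = divmod(dx, dy)      # base run length and remainder
    runs, err = [], 0
    for _ in range(dy):
        length = q
        err += r
        if err >= dy:          # time to hand out one remainder pixel
            length += 1
            err -= dy
        runs.append(length)
    return runs                # sum(runs) == dx
```

For example, `line_runs(10, 3)` yields three runs whose lengths sum to 10;
the work per scanline is constant, which is where the claimed advantage over
a per-pixel method comes from as device resolution grows.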