dgh@sun.UUCP (03/08/86)
> While we're on the topic, and you probably mentioned this on the net a few
> months ago, and probably a few years ago before that, has any machine ever
> implemented interval arithmetic a la Knuth: all floating point numbers
> represented by a pair (lowest possible, highest possible)?

Actually I don't recall seeing interval arithmetic mentioned here recently.
There have been software implementations at various places, particularly at
Karlsruhe by Nickel and colleagues, and particularly during the great Fortran
preprocessor bull market of the 1970's.  I thought it was probably a good
idea until IBM embraced something similar in its ACRITH package, which is
available microcoded on some newer mainframe models.  ACRITH mixes together
some good and not-so-good ideas with too much marketing hype.

Anyway, to touch on another hot topic - RISC vs CISC - interval arithmetic
instructions definitely come under the category of CISC.  Any machine
implementing ALL of the IEEE standard has the rounding modes needed to
microcode interval arithmetic instructions (or even to NOT microcode them
on a RISC).

If you want to know more about interval methods of computational error
analysis, there are several books by Ramon Moore and Karl Nickel; if you are
a glutton for punishment there are some heavy tomes from Kulisch and Miranker
which define an (over)formalization of ACRITH.
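
To make the rounding-mode point concrete, here is a rough sketch of what an
interval add looks like when done in software on an IEEE machine.  It is
written in C against the <fenv.h> rounding-mode interface (assume a compiler
and library that actually honor FENV_ACCESS); it is only meant to illustrate
the idea, not to stand in for anybody's package.

    /* Sketch of interval addition via IEEE directed rounding.
     * Assumes <fenv.h> support and hardware rounding modes. */
    #include <fenv.h>
    #include <stdio.h>

    #pragma STDC FENV_ACCESS ON

    typedef struct {
        double lo;   /* lowest possible value  */
        double hi;   /* highest possible value */
    } interval;

    /* Round the lower bound toward -infinity and the upper bound
     * toward +infinity, so the result encloses every possible sum
     * of the true (unknown) operands. */
    interval interval_add(interval a, interval b)
    {
        interval r;
        int old = fegetround();

        fesetround(FE_DOWNWARD);
        r.lo = a.lo + b.lo;

        fesetround(FE_UPWARD);
        r.hi = a.hi + b.hi;

        fesetround(old);   /* restore the caller's rounding mode */
        return r;
    }

    int main(void)
    {
        /* Degenerate intervals around the decimal constants 0.1 and
         * 0.2; the directed-rounding add brackets their true sum. */
        interval x = { 0.1, 0.1 };
        interval y = { 0.2, 0.2 };
        interval z = interval_add(x, y);
        printf("[%.17g, %.17g]\n", z.lo, z.hi);
        return 0;
    }

The two bounds of the printed result differ by one unit in the last place,
which is exactly the kind of enclosure an interval package delivers; a
hardware or microcoded implementation would just do the two directed-rounded
operations in one instruction.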