thorinn@diku.UUCP (Lars Henrik Mathiesen) (02/08/86)
In article <484@hoptoad.uucp> laura@hoptoad.UUCP writes:
>In article <11689@ucbvax.BERKELEY.EDU> weemba@brahms.UUCP (Matthew P. Wiener) writes:
>> (1) a == (a/b) * b + a%b
>> (2) (-a)/b == -(a/b)
>> (3) (a+b)/b == a/b + 1
>> (4) (a+b)%b == a%b
>While it is true that number theorists want 3 and 4, it is not the naive user
>who will be fooled by 2. It is the naive *mathematician*, which is just about
>everybody. To non-mathematicians, 2 is a law, with about the same force as
>the law of gravity, and not something that you can redefine.

I think that the only people in a position to be fooled by this are
programmers who don't have strong mathematical backgrounds. The reason:
in `normal' algebra you can operate to your heart's content on fractions
and so on, BUT you never round or truncate. In arithmetic, on the other
hand, you learn to do division with remainder, BUT not for negative
operands. (This may differ in America, I don't know.)

The point is that algebra assumes RATIONAL numbers, which makes ALL of
(1) to (3) true. But because programmers think they know how to do
integer division, they forget this assumption. That's why the `naive'
user thinks that BOTH (2) and (3) should be valid!

ABOLISH INTEGER DIVISION !!!!
--
Lars Mathiesen, DIKU, U. of Copenhagen, Denmark    ..mcvax!diku!thorinn