gt4586c@prism.gatech.EDU (WILLETT,THOMAS CARTER) (10/12/90)
I have been writing a number-crunching program, and in the process I have come across a most perplexing problem. I am using Think Pascal 3.0. I am reading a data file into a buffer, and then reading off chunks of the buffer to get the numbers.

My routine works fine when I have the compile options set to generate '020/'030 machine code and make calls to the FPU, with ELEMS881 = TRUE. However, when I change it to generate code for the SE, it goes whacky. The data gets read into my arrays, and when I plot the data it has the correct shape, but the magnitudes are reduced by a factor of about 65536, or 2 to the 16th power. This suggests that for some reason the first 16 bits of each number I try to read off are being zeroed out, effectively bit-shifting right sixteen places.

I am trying to read extended-precision (80-bit) numbers out of the file buffer. I find it truly bizarre that it works just fine with the MacII version but bombs with the SE version. No code is changed - only the compile options. Does anybody have any idea what might be going wrong? Thanks.

--
thomas willett
Georgia Institute of Technology, Atlanta
gt4586c@prism.gatech.edu
"Violence is the last refuge of the incompetent." - Salvor Hardin (Foundation)