beede@hubcap.UUCP (03/09/87)
While trying to get TeX running on a Celerity UNIX system, I encountered
some difficulty reading binary files.  Apparently Celerity Pascal is
_almost_ identical to Berkeley Pascal, down to sharing the same structure
for file descriptors.  However, the Berkeley version wouldn't work when
reading binary files.

To ensure portability between machines, TeX handles binary files a byte
at a time (thus ensuring that there is no big/little-endian conflict
between machines).  The Berkeley version declares the file as ``packed
file of -128..127'' and reads it with assignments from the file pointer,
in a loop (simplified) like

	while not eof(f) do begin
	    buf[i] := f^;
	    get(f);
	    i := i + 1
	end

Some values were lost, and I had no luck fixing the matter, so I wrote
an external C routine to read a byte and return it.  I changed the
(simplified) loop to

	while not eof(f) do begin
	    buf[i] := newread(f);
	    i := i + 1
	end

which STILL lost some values (always 255's).  These are really supposed
to be unsigned bytes; the old version tested for negative values and
added 256.  The final problem proved to be that eof(f), when the next
byte was 255, returned false only after MOVING THE FILE POINTER; i.e.,
one byte was lost.

My question: is this expected behavior, or am I correct in assuming that
someone cheaped out on the code generation and used some
character-oriented routine to read from anything that looked like a file
of bytes?

Thanks.
-- 
Mike Beede			UUCP: . . . !gatech!hubcap!beede
Computer Science Dept.
Clemson University
Clemson SC 29631-1906