[net.micro.atari] OBSCURE BUG--need help!

reid@uwvax.ARPA (03/14/84)

I am tearing my hair out!  When I try to run my modem at 1200 baud, I start
to lose characters every so often, very predictably and not noise-related.
I have written a terminal emulator in Action!, which works fine, except for
this bug.  It always follows an EOL character being Put() to the screen.  
The first character after the EOL gets lost somewhere and is chopped off on
the screen.  It is not stuck in the buffer, since the buffer never holds more
than two characters before I read them out.  No overflow there,
at least.  It has nothing to do with the host, since even my modem echoing
commands to me exhibits the same property.  It starts out fine, then after
about 20 lines it starts to act up.  Almost like bits are being lost somewhere.
The thing is, when I hit system reset, it goes back to normal again; that is,
the cycle starts over one more time.  This doesn't happen at all at 300 baud,
and the crazy thing is that if I put ANY CHARACTER besides EOL in the same
spot (like a slash, for experimentation's sake), then no bytes get lost.  Of
course, all the output is on the same line, too, so it isn't ideal.  I've
even tried to simulate EOL by fiddling with ROWCRS and COLCRS, but then
I have to scroll, and for some reason after I scroll the screen up I can't
use ROWCRS any more; it comes back out of range or something.

ANY IDEAS?  I can clarify specific details if you think you have some idea
what is going on.   JACK PALEVICH--I have been looking carefully at your
Kermit program--you do things pretty much the same--have you encountered this
strange behavior????   H E L P !!!

Glenn Reid  ...seismo!uwvax!reid