scs@adam.mit.edu (Steve Summit) (04/07/91)
I have the usual set of routines for installing an interrupt handler to allow reliable, interrupt-driven serial port input. These routines have been working fine, for several years, on all manner of AT's (286 and 386). Recently, however, I've discovered that they don't work at all well on an older XT.

The failure mode makes it act a lot like non-interrupt-driven input, with incredibly bad latency. For instance, when sending characters out over the serial port to a device which echoes them (the usual case), there is a delay of up to a second or two between the transmission of a character and the firing up of the interrupt routine to read the echoed character. (Curiously, the length of the delay depends on the time since the previous character: if no characters have been sent or received for several seconds, a character can be sent, echoed, and received almost immediately, while a character sent just after a received character encounters the maximum delay.) If several characters come in in rapid succession, only one interrupt occurs (for the last one), and only the last one is correctly received.

I can't imagine what could be causing a delay (and a long one, at that) between the appearance of a character at the serial port and the firing of the receiver interrupt. (I know the delay isn't being introduced in the remote device, which echoes immediately and works well when driven by other machines.)

(I have verified the send-echo-receive timing with a debugging printout just after the OUTP instruction which sends a character, an RS-232 breakout box, and another printout -- using a low-level console output routine, for safety -- at the beginning of the interrupt routine. One of the difficulties, of course, in debugging interrupt routines is that you can't use interactive debuggers, if for no other reason than that the timing is disturbed by single-stepping and waiting at breakpoints.)
I would suspect hardware problems (either in the serial card, or the XT's interrupt logic) except that Kermit (which I assume uses interrupt-driven I/O) runs fine on this machine, and one time (out of hundreds!) my program ran correctly, too. There are no TSR's running on the machine, nor anything else "funny" started out of CONFIG.SYS or AUTOEXEC.BAT. I don't believe any other interrupt routines are competing for the serial port interrupt vector, but I have yet to verify this.

The machine is a "true blue" IBM XT, although it might have a nonstandard serial port card. (It's not my machine, so I can't say exactly what the configuration is.) It's running DOS version 3.3.

I suspect that there are (subtle or otherwise) differences in serial port or interrupt logic between the XT and the AT on which this code was developed. Is anyone aware of any such differences, or of any other reason why serial port interrupts could behave this way on an XT? (I am not going to present the code, as it is far too long and involved. I'm just asking for your hunches or general insights.)

I'm not sure this is a general-interest topic, so I'd suggest you reply by mail.

                                        Steve Summit
                                        scs@adam.mit.edu

P.S. You don't need to tell me about all of the available, debugged, interrupt-driven serial port libraries out there, one of which I should be, and wish I were, using. There weren't as many of them back when this code was first written, and by now my calling programs of course depend on several of its features. I may yet convert to a more standard serial port library, but I'm still curious as to what's going on, and what I might be doing wrong.