caffin@decus.com.au (Roger) (07/10/89)
Re: PDP-11 MOVB instruction

Standard PDP-11 interface design is such that the decoding of the bus signals will generate either a BYTE write OR a WORD write signal. There is a specific signal for this: BWTBT. "It is asserted during BDOUT (in a DATOB bus cycle) for byte addressing." (DEC specifications.) The best place to see this is in the DRV-11 circuit diagrams, or in the DCK chip specifications. If you do it any other way, that's your decision, and NOT in accord with the PDP-11 architecture.

The suggestion that most cases would "sign-extend" is not correct. In fact, apart from the MOVB #-1,R0 case (register destination), I am unaware of any cases where this has been seen.

At one stage, DEC had an application note giving detailed information on how to use the DCK set; if you are interested, your sales rep might be able to dig up a copy. Well, he's got to be useful for SOMETHING!?

PS: Note that the DRV11 is NOT compatible with the Q-bus specification as implemented on the uVAX II. The DCK chip set is compatible, however, as is the DRV11-J quad version. There is a small mod you can do to the DRV11 to make it work, but it is a bit non-standard! Alternately, you can just put the DRV11 in, wait for a bus crash/bug check due to failure to conform to Q-bus specifications, and then tell the system to continue! This leaves the bus-error flip-flop set, but the bus interface controller is edge sensitive and fails to see this anymore... You won't detect any other bus failures either, which could be a problem...

Needless to say, DEC don't want to know about such devious mistreatment of the system, and I certainly don't accept any responsibility either! My thanks to Eric Piip of DEC Australia for the technical information which sorted this gremlin out for me. The info is available on the internal network, somewhere!
bcw@rti.UUCP (Bruce Wright) (07/15/89)
In article <396@decus.com.au>, caffin@decus.com.au (Roger) writes:
> Re: PDP-11 MOVB instruction
>
> Standard PDP-11 interface design is such that the decoding of the bus signals
> will generate either a BYTE write OR a WORD write signal. There is a specific
> signal for this: BWTBT. "It is asserted during BDOUT (in a DATOB bus cycle)
> for byte addressing." (DEC specifications.) The best place to see this is in
> the DRV-11 circuit diagrams, or in the DCK chip specifications. If you do it
> any other way, that's your decision, and NOT in accord with the PDP-11
> architecture.

A long time ago in a country far, far away there was a computer manufactured with a bus known as a UNIBUS. Because computer technology there was then in such a primitive state, numerous corners were cut (even by the manufacturer of said computer).

Seriously, the problem that causes this botch is only likely to be found on a UNIBUS machine. Such a bus does not look like the more modern Q-BUS. As I had mentioned in my previous posting, it is UNLIKELY that you will encounter the problem in hardware produced post-1980, but there is still a surprising amount of 1970's PDP-11 hardware floating around. Yes, even DEC hardware produced then often did not meet the "official" specs. It's pretty amazingly primitive by modern standards. (DCK chip? You mean that the communication with the bus can be done in ONE chip instead of the double handful we're using now?? Must have several dozen gates at least! :-) :-) :-)

(BTW, a DRV11 is a Q-BUS device ONLY. The analogous UNIBUS device is a DR11 -- many varieties existed. I don't know offhand if any of them had this problem.)

For what it's worth, this was known around 1976-1977 at least in the PDP-11 systems community.
I have *seen* this problem more than once, and know at least one person who was asked about this subject when he interviewed with DEC during that time frame (he gave the correct answer that some devices would not handle the byte reference properly, and that the result turned out to be a sign extend on the processors then in use - his interviewer made the comment that not everyone seemed to be aware of this).

I can certainly understand that this whole topic is something of a historical curio -- it isn't a problem with modern hardware and hasn't been for quite a while. But as I mentioned in the previous article, if you produce commercial software you can't necessarily know what kind of ancient hardware will be subjected to your program, and in that case it's better to be safe than sorry. I'd recommend writing words rather than bytes to any type of device which had some version that existed during the 1970's (especially the early 1970's).

Bruce C. Wright