jpl@allegra.UUCP (John P. Linderman) (03/27/84)
In the transition from 4.1 to 4.2, Berkeley changed the default line kill character from @ to ^U. I suppose it was done because they are always typing person@place, and they got tired of having the line cleared. Whatever the reason, the change has flat out broken uucp logins at sites (like allegra) that run 4.2. The login sequences in the L.sys (aka Systems) files of many sites contain embedded @'s that used to make sure any line noise was wiped away before entering items like the password. The @'s have now become a (most unwelcome) part of the password.

I'll probably go into /usr/include/sys/ttychars.h, reset the erase, kill and interrupt characters, and install a new /bin/login. That will probably break some other stuff, but that's what happens when conventions are arbitrarily ignored. Other uucp sites converting to 4.2 are hereby warned.

John P. Linderman
Department of Character Assassination
allegra!jpl
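For concreteness, here is a sketch of the sort of L.sys entry in question (the site name, phone number, login and password are made up). The fields after the phone number are alternating expect/send pairs: "" sends the first @ without waiting for anything, and in:-@-in: means "wait for in:, and if it doesn't arrive, send another @ and wait again". Those @'s exist purely to flush line noise by killing the partial line:

    allegra Any ACU 300 5551212 "" @ in:-@-in: Uourhost ssword: ourpass

On a remote 4.2 system where @ is no longer the kill character, they become part of the login name and password instead.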
jas@drutx.UUCP (ShanklandJA) (03/27/84)
Regarding changing the default line kill character from '@' to ctl-U: it may break one or two things temporarily, but I'd say it's a change that's vastly overdue. Using printing characters for line editing is an anachronism that should go the way of using UNIX on a DECwriter over a 300 baud line.

Jim ("Sys V, but 'stty kill ^U' is in my .profile") Shankland
..!ihnp4!druxy!jas
henry@utzoo.UUCP (Henry Spencer) (03/29/84)
Actually, utzoo's default kill character has been control-U for a long time, and we haven't seen any great problems with people talking to us. People who assume that @ kills line noise, without checking its behavior on the specific systems in question, are being foolish. There simply is no standardization in this area, none none none.
--
Henry Spencer @ U of Toronto Zoology
{allegra,ihnp4,linus,decvax}!utzoo!henry
piet@mcvax.UUCP (Piet Beertema) (03/29/84)
Even on our transatlantic links the noise problem can be circumvented without knowing about any "kill character". Most problems arise during the initial modem response. The solution, although it takes a short while, is to first send a fake login/password, e.g. "xyz\n", and then send the real login/password.
--
Piet Beertema, CWI, Amsterdam
...{decvax,philabs}!mcvax!piet
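In L.sys chat-script terms, Piet's trick might look something like this (site name, phone number, login and password are all hypothetical). The first two expect/send pairs deliberately waste one login on the throwaway string, so any noise is absorbed before the real login name and password are offered:

    mcvax Any ACU 1200 5551234 "" xyz ssword: xyz ogin: Uourhost ssword: realpass

The blind "xyz" lands as a (possibly garbled) login name, the fake password fails it, and the fresh "login:" prompt that follows is then answered for real.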
honey@down.UUCP (code 101) (03/30/84)
when typing at a low baud rate to a uart listening at a high baud rate, the uart can miss the stop bit(s), causing a framing error, yielding a break. thus, if getty is set up to switch from high baud rates to low ones, it's convenient to type the line kill character a few times to get to the right baud rate. one string of bits that works better than most is 0100, which is, you guessed it, @.

there are two lessons here: use getty choices in decreasing order, and give @ special treatment. re-iterating john linderman's original bug report, there's also the fact that for years, sites in the know have been passing out L.sys login scripts containing "" @ and in:-@-in:. who is now responsible for changing the L.sys files on the neighboring sites?

peter honeyman
trb@masscomp.UUCP (03/31/84)
> Regarding changing the default line kill character from '@' to ctl-U: it may break one or two things temporarily, but I'd say it's a change that's vastly overdue. Using printing characters for line editing is an anachronism that should go the way of using UNIX on a DECwriter over a 300 baud line.

Nah, I think that the change is vastly underdue in getty's case. Why? gummo!ber told me about the wonderful grace of giving getty an atsign enema. Obviously, if the remote system is at the right baud rate, the atsign will clear crap (if any) out of getty's input buffer. Less obviously, if getty is at a baud rate that's too fast, the atsign will effect a baud rate change on the remote UNIX. This is true because an atsign has the binary value 01000000, and if you send it too slowly, you'll get a cute little one bit with no stop bit. Hence, frame error, which the remote detects as break, and presto, UNIX changes gears. This is why it's a good idea to cycle the speeds in your getty tables from fastest to slowest. ^U is 00010101, which isn't pretty when you're looking for a long string of 0's to look like a break.

Anyway, if UCB wanted to be cool, they should have hacked getty/login to accept @, ^X, and ^U for kill and #, ^H, and DEL for erase. I think that changing getty/login to ONLY accept DEL/^U is an extremely losing idea. L.sys entries will be broken all over the universe. (No, I don't use @/#, nor do I use DEL/^U.)

Andy Tannenbaum
Masscomp Inc
Westford MA
(617) 692-6200 x274
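On 4.2BSD, the fastest-to-slowest cycling Andy recommends would look something like this in /etc/gettytab (the labels and the choice of speeds are only an example). Each table names the next one to try with nx=, so a break, or the framing error a slow @ produces, steps getty down to the next speed:

    # hypothetical dialup rotary: try 1200 baud first, fall back to
    # 300 on a break, then wrap around to 1200 again
    d1200|Dial-1200:nx=d300:sp#1200:
    d300|Dial-300:nx=d1200:sp#300: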
honey@down.UUCP (code 101) (04/01/84)
from the credit where credit's due dept.: i too learned the magic of @ from gummo!ber.

peter
hansen@pegasus.UUCP (Tony L. Hansen) (04/01/84)
An interesting point about System V (as distributed) is that it accepts any of '@', DEL or ^U for erasing the line when you're talking to getty. It only accepts '#' for character erase, however. All of this is done in raw mode. If it sees what looks like a NULL character (coming from typing ^@, CR at a higher line speed, or the break key), it will search for the next line entry and speed in /etc/gettydefs. I haven't checked System V Release 2 to see if it has the same properties.

Unfortunately, once you've gotten to the first "Password:" prompt, you are no longer talking to getty, but instead to login, which does things in cooked mode rather than in raw mode, so you're back to the old '@' again.

Tony Hansen
pegasus!hansen
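The speed hunting Tony describes is driven by the last field of each /etc/gettydefs line, which names the entry to try next. A stripped-down example (the labels and flag sets here are illustrative, not a recommended configuration):

    9600# B9600 HUPCL # B9600 SANE IXANY #login: #1200
    1200# B1200 HUPCL # B1200 SANE IXANY #login: #300
    300# B300 HUPCL # B300 SANE IXANY #login: #9600

A break (or the NULL described above) moves getty from the current entry to the one named in the trailing field, so listing the speeds in decreasing order gives the same fastest-to-slowest behavior discussed for getty tables earlier in this thread.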
mark@cbosgd.UUCP (Mark Horton) (04/01/84)
I'm amazed that nobody has looked at the 4.2BSD source or tried out different erase and kill characters. Had you done either, you would have noticed that getty accepts ^U and @ for line kill, and accepts #, DEL, and ^H for erase. getpass is still the same old simple routine, however.

Mark
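In outline, the acceptance Mark describes amounts to checking each input character against a small set of kill and erase characters. Here is a toy C program in that spirit; it is a sketch of the idea only, not the actual 4.2BSD getty source (which is the thing to go read):

    #include <stdio.h>

    #define CTRL(c) ((c) & 037)

    static int is_kill(int c)  { return c == '@' || c == CTRL('u'); }
    static int is_erase(int c) { return c == '#' || c == 0177 || c == CTRL('h'); }

    int main(void)
    {
        char buf[128];
        int c, n = 0;

        /* gather one line, honoring all the kill and erase characters */
        while ((c = getchar()) != EOF && c != '\n') {
            if (is_kill(c))
                n = 0;                  /* wipe the whole line */
            else if (is_erase(c)) {
                if (n > 0)
                    n--;                /* rub out one character */
            } else if (n < 127)
                buf[n++] = c;
        }
        buf[n] = '\0';
        printf("got: %s\n", buf);
        return 0;
    }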
thomson@uthub.UUCP (Brian Thomson) (04/02/84)
On the Framing Errored-ness of Bit Strings, masscomp!trb has this to say:

> atsign has the binary value 01000000, and if you send it too slowly, you'll get a cute little one bit with no stop bit. ^U is 00010101, which isn't pretty when you're looking for a long string of 0's to look like a break.

This presents me with a fine opportunity to belabour the uninteresting. You don't need "a long string of 0's" to guarantee a framing error, just a 0 in the correct place.

Case 1: transmitter faster than receiver. The intercharacter idle line looks like a giant stop bit, so no framing error here. Avoid this case by starting getty at a high speed and stepping down.

Case 2: transmitter 1/2 receiver speed. Number the data bits from right to left (this is the order in which they are transmitted), beginning at 0. Each transmitter bit time will be received as two bits, so the receiver will look for a stop bit in the second half of transmitted data bit 3. Both ^U and @ have a 0 in this position, so both should cause framing errors.

Case 3: transmitter 1/4 receiver speed. The receiver now interprets the second quarter of data bit 1 as its stop bit. Again, both ^U and @ win here.

Case 4: transmitter 1/8 receiver speed. A common case, corresponding to transmission at 1200 baud and reception at 9600. Here the stop bit is "seen" in transmitted data bit 0. This is where ^U loses.

Case 5: transmitter 1/16 or less of receiver speed. The receiver sees only the start bit, hence is guaranteed a framing error.
--
Brian Thomson, CSRG Univ. of Toronto
{linus,ihnp4,uw-beaver,floyd,utzoo}!utcsrgv!uthub!thomson
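To make the case analysis concrete, here is a small C program (mine, and a sketch only: it assumes 8 data bits, no parity, and a receiver that samples in the middle of each of its own bit cells). The receiver expects its stop bit in bit cell 9 (start = 0, data = 1 through 8), so when the transmitter runs at 1/r of the receiver's speed, that sample lands at transmitter bit time 9.5/r after the start edge:

    #include <stdio.h>

    int main(void)
    {
        int ratios[] = { 2, 4, 8, 16 };
        int chars[] = { 0x40, 0x15 };       /* '@' and ^U */
        char *names[] = { "@ ", "^U" };
        int i, j;

        for (i = 0; i < 4; i++) {
            int r = ratios[i];
            int bit = (int)(9.5 / r);       /* 0 = start, 1..8 = data bits 0..7 */

            for (j = 0; j < 2; j++) {
                int level;                  /* line level at the stop sample */

                if (bit == 0)
                    level = 0;                           /* start bit: space */
                else if (bit <= 8)
                    level = (chars[j] >> (bit - 1)) & 1; /* data, sent LSB first */
                else
                    level = 1;                           /* stop bit/idle: mark */

                printf("1/%-2d speed, %s: sampled %s -> %s\n",
                       r, names[j],
                       bit == 0 ? "start bit" : "a data bit",
                       level ? "no framing error" : "framing error (break)");
            }
        }
        return 0;
    }

Running it reproduces the cases above: every combination yields a framing error except ^U at the 1/8 ratio, where the stop-bit sample lands on ^U's low-order 1 bit.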
ptw@vaxine.UUCP (P. T. Withington) (04/12/84)
If you send @ and return, you're basically doing a bad login to a system that only accepts ^U, so you should soon see a clean ogin--ogin... Plus, it was my understanding that 4.2 init/login accept either ^U or @, 'though I haven't bothered to examine the source.

...vaxine!ptw (soon to be ...trope!ptw)