sean@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) (11/06/88)
Now that the crime of the century has been solved and all of the bows have been taken it is, perhaps, time to reflect a little more on the implications of what has happened.

First of all, to the nature of the problem. It has been suggested that this was little more than a prank let loose without sufficient restraint. I have not seen the latest in press releases but there seems to be a hint of "I didn't want anything like this to happen!" Perhaps not. In fact, if the thing had not run wild and had not bogged down a number of systems it might have gone undetected for a long time and might have done much worse damage than our estimates suggest was done. I can accept that the author did not anticipate the virulence of his creation, but not that his regret stems from some benevolent concern for the users of other systems; rather, it is because that virulence allowed him to be caught. In fact, with function names such as "des", "checkother", and "cracksome", I am less likely to believe that the intent of this program was one of simple impishness.

Let's look, for a moment, at the effects of this program (whether intended or otherwise). First, it satisfied a public desire for news and, one might argue, served as a reassurance to the many technophobes out there that our systems are as vulnerable and error prone as they, all along, have been arguing. If you don't think that this might have social consequences you need only look at how things like community bans on genetic research have resulted from social policy implemented as a result of public distrust. When I was interviewed by a local news agency the questions asked were on the order of "Does this mean that someone could fix a Presidential election?" (sure, Daley did it in Chicago, but he didn't use computers!), and "What implications does this have for the nation's defense?" (In spite of reassurances from here and CMU, the local media still insisted on the headline "Defense Computers invaded by virus.")

Second, there is an economic consequence. 
Since we were unable to determine the extent of the program's activities we were forced to commit programmers' time to installing kernel fixes, rebuilding systems, checking user data files, and checking for other damage. That was the direct cost. The indirect cost comes from the delay in other tasks that was incurred by the diversion of people's time to solving this one. If you multiply by the effort that is going on at a number of other sites I suspect that in salary time alone you are looking at costs into the hundreds of thousands of dollars.

Perhaps most importantly, there is the academic cost. I would argue that the popularity of Unix today is due in great part to the development of the Berkeley Software Distribution, which was made available in source form to thousands of research and academic organizations starting in the '70s. In a sense, it is a community-designed system, and although Berkeley deserves the lion's share of the credit, it was the contribution of hundreds of users with access to source code that allowed the system to evolve in the way that it did.

There is a cost to providing an academic environment and there are responsibilities that are imposed by it. One advantage of academia is access to information which would not be tolerated in an industrial domain. This access requires our users to observe some code of behavior in order to guarantee that everyone will have the same access to the same information. The person who rips the pages of an article out of a library journal is abusing this privilege of free access to information and depriving others of the same. By convention, we agree not to do that, and so we protect the system that has benefited us so that others derive the same benefit.

A great part of the Internet was funded by DARPA because some forward-thinking individuals recognized the tremendous technological and academic benefits that would be derived from this open network. 
This has resulted, I believe, in significant economic benefits to American industry and continues to support our leadership role in software development. It is an infrastructure that supports a gigantic technological community, and there are very few, if any, computer interests in this country that were not influenced by DARPA's experiment.

Within a week or two, members of the organizations responsible for this network are going to be meeting to discuss the implications of the recent virus(es), and the mechanisms by which they can be dealt with. One possible outcome would be increased restrictions on access to the network (the Defense Research Network is already moving along these lines). It would not be unreasonable to consider whether a venture such as this should be supported at all. To restrict access to a network such as this, or to remove the network altogether, would be the economic equivalent of tearing up the Interstate highway system. The effect on academic and technological advancement would be quite serious. The bottom line is that to suggest that a program such as the "virus" (which is really more of a Trojan horse) was little more than a harmless prank is to overlook the long-term effects that both the technology, and the PUBLICATION of that technology, will have on continued academic freedom and technological growth.

But what of the nature of the act? Is there something to be said of that? First, there is the personal tragedy here. There is public humiliation for the (supposed) perpetrator's father, who is, himself, a computer security expert (his employers must be questioning whether the son had access to specialized information, though most of us realize that the holes that were exploited were well known). There is the jeopardy to the academic career of the programmer. But there is more than that. There seems to be a real lack of consideration for the ethical implications of this action. 
Consider, for a moment, that you are walking down the street and the person in front of you drops a 10 dollar bill. You have three options: 1) You can pick it up and hand it to them; 2) You can pick it up and keep it; 3) You can leave it and continue walking. It should be obvious that these choices are not morally equivalent. To have known about the holes in the system which allowed the virus in (and even to have known how to exploit them) is NOT the same as actually doing it (any more than leaving the bill on the sidewalk is the same as pocketing it).

Somewhere along the line, we fail ourselves and our students if we don't impress upon them the need to regard the network as a society with rights, responsibilities, and a code of professional ethics which must be observed in order to preserve that society. There are probably a few hundred people who could have written the code to do what this virus did; most of those people didn't do it. Most, if not all, of us have had the opportunity to pocket a candy bar from the local convenience store, but most of us don't. We don't, not because we will be punished or because there are laws against it, but because we have a social consciousness which tells us that such an action would, in the end, substantially degrade the society in which we live.

What happened in this situation reflects not only a moderately high level of programming sophistication but also a disturbingly low level of ethical maturity. If we tolerate those who view the network as a playground where anything goes, we are going to be faced with serious consequences. But the answer is not to change the character of the network (by increasing restrictions and decreasing freedom of access), but to promote a sense of character among the members of the community who work and experiment in this network. 
This puts the burden on us to remember that there is a need for us to encourage, teach, and provide examples of the kinds of behavior that we need to preserve in order to preserve the network.

Sean McLinden
Decision Systems Laboratory
University of Pittsburgh
peter@ficc.uu.net (Peter da Silva) (11/06/88)
One side effect that I don't like is that UNIX is taking the blame for a combination of (1) a security hole in an application (sendmail), and (2) deliberate loosening of security to trusted sites (rhosts, etc...). Non-academic UNIX in general is a lot less open to techniques like this.
-- 
Peter da Silva  `-_-'  Ferranti International Controls Corporation
"Have you hugged  U  your wolf today?"     uunet.uu.net!ficc!peter
Disclaimer: My typos are my own damn business.   peter@ficc.uu.net
chend@beasley.CS.ORST.EDU (Donald Chen - Microbiology) (11/07/88)
In article <1698@cadre.dsl.PITTSBURGH.EDU> sean@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) writes:
>Now that the crime of the century has been solved and all of the
>bows have been taken it is, perhaps, time to reflect a little more
>on the implications of what has happened.
> text deleted
>
>Let's look, for a moment, at the effects of this system (whether
>intended or otherwise). First, it satisfied a public desire for news
>and, one might argue, served as a reassurance to the many technophobes
>out there that our systems are as vulnerable and error prone as they,
>all along, have been arguing. If you don't think that this might have
>social consequences you need only look at how things like community
>bans on genetic research have resulted from social policy implemented
>as a result of public distrust. When I was interviewed by a local news

Are you suggesting that the "public" does not have an interest in, and a responsibility to ask for, suitable safeguards against what "they" consider to be either dangerous or incompletely thought out? Although people like Jeremy Rifkin have been nuisances to the practical application of bio-engineered tools, they have also caused investigators to think out their studies more completely, AND have forced scientists to explain and defend their approaches and tools to the people who ultimately fund their research.

>Second, there is an economic consequence. Since we were unable to
>determine the extent of the program's activities we were forced to
>commit programmers' time to installing kernel fixes, rebuilding systems,
>checking user data files, and checking for other damage. That was
>the direct cost. The indirect cost comes from the delay in other

Perhaps I am foolish, but I feel some of the responsibility goes to whoever left the debug option in sendmail, and to those who allow promiscuous permissions in their systems. 
>
>If we tolerate those who view the network as a playground where
>anything goes, we are going to be faced with serious consequences. But
>the answer is not to change the character of the network (by increasing
>restrictions and decreasing freedom of access), but to promote a sense
>of character among the members of the community who work and experiment
>in this network. This puts the burden on us to remember that there
>is a need for us to encourage, teach, and provide examples of the
>kind of behaviors that we need to preserve in order to preserve the
>network.
>

You talk of personal responsibility - to oneself, to one's colleagues, to one's community - and I heartily agree; however, you also talk of the burden we all have to somehow teach and instill in others that sense of rightness which makes the net possible. This does not ensure that those whom we teach will listen, and even if they do, that they will do it right away. Perhaps there is an analogy to children who, though they have been told to "do right", test the limits of their freedom and the extent of their personal strengths. We hope that through time and experience these children will grow to become an integral part of their communities - but it takes time.

I do not wish to condone the actions of anyone who disrupts the net or rips out pages from library books or trashes the environment in which we all live. Although our site has not seen evidence of this particular virus, it will, no doubt, be the victim of others. In that vein, we need to protect our site from the thrashings of either childish behaviour or cynical attacks. This means we treat our sites more protectively - viz. the family heirloom - yet not so much that growth and evolution of the system is stifled. I suspect that the openness and collegiality we would all like pay their price in these attacks. We can only mute their number and intensity.

Don Chen
Dept of Microbiology
Oregon State University
gwyn@smoke.BRL.MIL (Doug Gwyn ) (11/10/88)
In article <1698@cadre.dsl.PITTSBURGH.EDU> sean@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) writes:
>(In spite of reassurances from here and CMU, the local media still
>insisted on the headline "Defense Computers invaded by virus.")

The media was right. For example, VGR.BRL.MIL was inoperative for days while we studied the virus here (since that system had already been infected). VGR.BRL.MIL plays a key role in several projects that are important to the national defense. Other military sites are known to have been affected. Fortunately we have been able to characterize the behavior of the virus and now know that it did not, for example, alter critical databases.

>Second, there is an economic consequence. Since we were unable to
>determine the extent of the program's activities we were forced to
>commit programmers' time to installing kernel fixes, rebuilding systems,
>checking user data files, and checking for other damage.

We spent our time instead determining the exact extent of the virus's abilities. As a result we found that we did not need to worry about the effects of Trojan horses, etc. (which could well have been part of what the virus/worm did, although we were lucky this time).

>"virus" (which is really more of a Trojan Horse), ...

It could be considered a "worm" but not meaningfully a Trojan horse. It had the opportunity to install Trojan horses but didn't do so.

>low level of ethical maturity.

So where is the student to learn better? The current culture is founded more on the philosophy of pragmatism than anything else, and accordingly the student is encouraged in his belief that nearly anything is okay so long as he doesn't get caught. If you want to establish rational values as the norm, you have your work cut out for you. It's a worthwhile goal, but won't be accomplished quickly.
gwyn@smoke.BRL.MIL (Doug Gwyn ) (11/10/88)
In article <2151@ficc.uu.net> peter@ficc.uu.net (Peter da Silva) writes:
>One side effect that I don't like is that UNIX is taking the blame for
>a combination of (1) a security hole in an application (sendmail), and
>(2) deliberate loosening of security to trusted sites (rhosts, etc...).
>Non-academic UNIX in general is a lot less open to techniques like this.

The virus exploited two security holes in Berkeley-supplied servers. We found that several commercial offerings that included this software had done little more than stick their own label on it; they did not go over the code and fix its problems before releasing it. In fact, in the case of sendmail, they didn't even turn off the DEBUG flag in the Makefile. The technical problems that were exploited were mostly sloppiness that nobody had reviewed and corrected in time. We know of a few other similar security holes that the virus didn't try to exploit. One could also challenge the design that provides privileged access via sockets and their servers without adequate authentication. The lessons to be learned are not overly simple, and until they have been thoroughly assimilated by the right people, you can be assured that there are more security holes of the same general nature.

Try the following on your favorite remote 4BSD-based system:

	rlogin host -l ''

This attack works a surprising percentage of the time. The problem that provides the hole has been known for many years and was fixed at least as long ago as 1984 in the AT&T-supplied UNIX variants. But it persists in the Berkeley variants. Perhaps this note will prompt the various vendors to finally fix this problem! The REAL problem is that too many people just do not care about security, probably because they don't understand how it affects them.
chris@mimsy.UUCP (Chris Torek) (11/10/88)
In article <8845@smoke.BRL.MIL> gwyn@smoke.BRL.MIL (Doug Gwyn ) writes:
>The technical problems that were exploited were mostly sloppiness that
>nobody had reviewed and corrected in time. We know of a few other
>similar security holes that the virus didn't try to exploit.

Well, good grief, SEND THEM TO US. WE *WILL* FIX THEM. This is a large part of what comp.bugs.4bsd.ucb-fixes is about. (Or do you mean that they are fixed in 4.3tahoe but not other 4BSD-derived systems?)

>Try the following on your favorite remote 4BSD-based system:
>	rlogin host -l ''

I get: `Password:'

Obviously this one has been fixed in 4.3tahoe.
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7163)
Domain: chris@mimsy.umd.edu	Path: uunet!mimsy!chris
gwyn@smoke.BRL.MIL (Doug Gwyn ) (11/11/88)
In article <14465@mimsy.UUCP> chris@mimsy.UUCP (Chris Torek) writes:
>In article <8845@smoke.BRL.MIL> gwyn@smoke.BRL.MIL (Doug Gwyn ) writes:
>>The technical problems that were exploited were mostly sloppiness that
>>nobody had reviewed and corrected in time. We know of a few other
>>similar security holes that the virus didn't try to exploit.
>Well, good grief, SEND THEM TO US. WE *WILL* FIX THEM. This is a
>large part of what comp.bugs.4bsd.ucb-fixes is about. (Or do you mean
>that they are fixed in 4.3tahoe but not other 4BSD-derived systems?)

Last time I tried, there was a distinct lack of interest!

>>Try the following on your favorite remote 4BSD-based system:
>>	rlogin host -l ''
>Obviously this one has been fixed in 4.3tahoe.

Not necessarily. Try the following:

	# vi /etc/passwd
	  <insert an extra blank line, say at the end>
	$ passwd
	  <change your password, say to the same thing it already is>
	$ su ''
	# surprise!

If this hole exists, it can be traced to getpwent() not being careful enough when it parses /etc/passwd records. See UNIX System V for the simplest fix.
gwp@hcx3.SSD.HARRIS.COM (11/11/88)
Written 5:40 pm Nov 8, 1988 by scott@attcan.UUCP (Scott MacQuarrie)
> There is a product available from AT&T's Federal Systems group called
> MLS (Multi-Level Security) which provides B1-level security in a System V
> Release 3.1 environment. I have seen the product on a 3B2; its availability
> from other vendors would probably require work by those vendors.

It did. It's done. It's called CX/SX.

Gil Pilz -=|*|=- Harris Computer Systems -=|*|=- gwp@ssd.harris.com
jfh@rpp386.Dallas.TX.US (John F. Haugh II) (11/11/88)
In article <8845@smoke.BRL.MIL> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>The REAL problem is that too many people just do not care about
>security, probably because they don't understand how it affects
>them.

I don't recall whether it was Doug Gwyn or Guy Harris who last complained when I attacked the CSRG and BSD. Do you *really* trust college students to write real software? If so, you must have never attended a university similar to the one I graduated from.
-- 
John F. Haugh II                       +----Make believe quote of the week----
VoiceNet: (214) 250-3311   Data: -6272 | Nancy Reagan on Artifical Trish:
InterNet: jfh@rpp386.Dallas.TX.US      |         "Just say `No, Honey'"
UucpNet : <backbone>!killer!rpp386!jfh +--------------------------------------
sean@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) (11/12/88)
>In article <8845@smoke.BRL.MIL> gwyn@smoke.BRL.MIL (Doug Gwyn ) writes:
>Try the following on your favorite remote 4BSD-based system:
>	rlogin host -l ''

Or, to keep someone else from doing this, remove lines like:

	::0:0::

from your password file. Most Sun systems have this as a default (stupid!). The alternative is to fix your login programs, which you might not be able to do with a binary license. Or, you could run MACH (Version 3 will have NFS).

Sean McLinden
Decision Systems Laboratory
guy@auspex.UUCP (Guy Harris) (11/12/88)
>If this hole exists, it can be traced to getpwent() not being careful >enough when it parses /etc/passwd records. See UNIX System V for the >simplest fix. If that fix is "have 'getpwent()' return NULL if the entry it looks at is syntactically incorrect," the fix is simple but rather rude; the net result is that any program scanning the password file linearly - e.g., "passwd" - will think it's at the end of the file if it sees such a syntactically incorrect line. Having "passwd" cut off the password file as soon as it sees a blank line isn't very nice; ignoring the syntactically-invalid lines, or passing them through unchanged, is probably a better idea. The former could be done by having "getpwent" skip over those entries, rather than return NULL on them; the latter requires that "passwd" not just naively use "(f)getpwent" and "putpwent" to update the password file.
mike@stolaf.UUCP (Mike Haertel) (11/13/88)
In article <8562@rpp386.Dallas.TX.US> jfh@rpp386.Dallas.TX.US (John F. Haugh II) writes:
>Do you *really* trust college students to write real software? If so, you
>must have never attended a university similar to the one I graduated from.

I am a college student. Also the author of GNU grep, coauthor of GNU diff, and working on GNU sort . . . all of my programs are faster and (I hope) more robust than the Unix programs they replace. I am glad to hear that you don't trust me to write real software, and that you will not be using my programs.

Do you really trust your vendor to write real software? Most of them won't distribute source, so you can't check for trojan horses et al. You can't fix bugs that arise, unless you are good at reading and patching binaries; most of them have license agreements that prevent you from doing this, if you are a person who keeps your word. I have heard that the reason some vendors don't distribute source is that they don't want their customers to see how badly written it is.
---
Mike Haertel
Really mike@stolaf.UUCP, but I read mail at mike@wheaties.ai.mit.edu.
guy@auspex.UUCP (Guy Harris) (11/13/88)
>Or, to keep someone else from doing this, remove lines like:
>
>::0:0::
>
>from your password file. Most Sun systems have this as a default
>(stupid!).

Excuse me, but to what are you referring? Most Sun systems have a line like

	+::0:0:::

as a default, but this is INequivalent to

	::0:0::

Lines of the latter sort are generated by the scenario Doug Gwyn described; the problem is that "getpwent" doesn't, in some systems, check that the login name field is non-null before returning a value. (S5R3's version checks, but unfortunately returns NULL rather than skipping the invalid entry, which causes programs to think a blank line in "/etc/passwd" is really the end of the file.)
sean@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) (11/13/88)
:In article <445@auspex.UUCP> guy@auspex.UUCP (Guy Harris) writes:
::Or, to keep someone else from doing this, remove lines like:
::
::::0:0::
::
::from your password file. Most Sun systems have this as a default
::(stupid!).
:
:Excuse me, but to what are you referring? Most Sun systems have a line
:like
:
: +::0:0:::
:
:as a default, but this is INequivalent to
:
: ::0:0::
:
Excuse ME, but the last four lines of my SunOS 4.0 distribution tape
password file are:
+::0:0:::
::0:0:::
::0:0:::
::0:0:::
NeXT?
Sean McLinden
Decision Systems Laboratory
madd@bu-cs.BU.EDU (Jim Frost) (11/14/88)
In article <772@stolaf.UUCP> mike@wheaties.ai.mit.edu writes:
|In article <8562@rpp386.Dallas.TX.US> jfh@rpp386.Dallas.TX.US (John F. Haugh II) writes:
|>Do you *really* trust college students to write real software? If so, you
|>must have never attended a university similar to the one I graduated from.

Actually, those students who produce code often do a better job than 'professionals', mostly because they have the time to do it right. Professionally written software is most often pushed out the door, which isn't likely to help its quality. I could cite examples of this, but you have probably seen it as often as I have anyway.

Another thing that happens with professionally produced software is the author deliberately making it hard to follow (read: modify and debug) in order to ensure his (her) job security (kind of reminds me of Bush picking Quayle, come to think of it :-). Again, not something a student, writing on his own, is likely to do.

Would I trust student-written code? You bet your life I would, but only after giving it a little personal attention, something that should always be done anyway.

BTW, I would be interested in knowing what exactly constitutes a "student". A good many people I know, myself included, write things professionally as well as go to school. Should you only trust those things I write while I'm at work? The questions could go on and on....

jim frost
madd@bu-it.bu.edu
carroll@s.cs.uiuc.edu (11/14/88)
/* Written 2:37 pm Nov 12, 1988 by mike@stolaf.UUCP in s.cs.uiuc.edu:comp.unix.wizards */
I have heard that the reason some vendors don't distribute source is that they don't want their customers to see how badly written it is.
---
Having seen some proprietary source code that made me gag, I wouldn't be at all surprised at that.

Alan M. Carroll           "How many danger signs did you ignore?
carroll@s.cs.uiuc.edu      How many times had you heard it all before?" - AP&EW
CS Grad / U of Ill @ Urbana    ...{ucbvax,pur-ee,convex}!s.cs.uiuc.edu!carroll
gwyn@smoke.BRL.MIL (Doug Gwyn ) (11/14/88)
In article <439@auspex.UUCP> guy@auspex.UUCP (Guy Harris) writes:
>If that fix is "have 'getpwent()' return NULL if the entry it looks at
>is syntactically incorrect," the fix is simple but rather rude;

That would be rude, but it wasn't what I was talking about. All you really need to do is to skip over a bogus entry and resynchronize for the next one.
rbj@nav.icst.nbs.gov (Root Boy Jim) (11/15/88)
Doug,

Sometime awhile back (this spring, summer?), I remember someone's comment regarding which sources contained `gets', the routine used to subvert fingerd. I recall you thanking the poster and stating your intention to eradicate it from your System V emulation code. I applaud you for your foresight, and share your distaste for this beast. You may very well have saved yourself from one prong of the fork.

I can imagine you crusading against gets() in both the C and POSIX standards and I hope you have had success in that area. I would go so far as to suggest that everyone remove this routine from libc.a and place it in a separate library, available only upon special request, for binary applications only, after filling out numerous forms. I can see it now, a paper entitled `Local Variables Considered Harmful'.

(Root Boy) Jim Cottrell (301) 975-5688
<rbj@nav.icst.nbs.gov> or <rbj@icst-cmr.arpa>
Careful with that VAX Eugene!
gwyn@smoke.BRL.MIL (Doug Gwyn ) (11/15/88)
In article <17519@adm.BRL.MIL> rbj@nav.icst.nbs.gov (Root Boy Jim) writes:
>I can imagine you crusading against gets() in both the C and POSIX
>standards and I hope you have had success in that area. I would go
>so far as to suggest that everyone remove this routine from libc.a
>and place it in a separate library available only upon special request
>for binary applications only, after filling out numerous forms.

Although I probably voted to remove gets() from the proposed C standard, I will stand by X3J11's decision to leave it in. As explained in discussions raging in comp.lang.c (INFO-C), there are safe uses for gets(), its "problem" is well known, there are several other standard library routines with similar characteristics, and a lot of existing code uses it (sometimes safely, sometimes not).

People are focusing on the wrong problem. The Internet virus also attacked through a hole unrelated to gets(), and I know of at least three other such holes. The general problem is lack of sufficient attention to detail in security-related code. You're not going to solve this by outlawing a sometimes useful tool.
zjat02@cra2.uucp (Jon A. Tankersley) (11/15/88)
I've read too much not to comment.....

The question is not actually 'can you trust any university student', but 'can you trust any person'. The answer is yes and no, short of getting some crack programmers together and brainwashing them; and even then it would be difficult - they could turn on you. Anybody is culpable. Anyone can be 'broken'. Maturity alone does not make a person more reliable.

There are/were some university students that I can/would trust to write clean code. This is because of my 'more than cursory' knowledge of the people in question: after working with them for 4 years, I knew what their morals and ideals were. I also knew the other type, whom you couldn't trust to give you the correct time. But even the people I could trust could/can be broken and subverted. And that is not a crime; that is human nature. Given the correct type of hard choices, anyone can be subverted.

But this doesn't deal with the issue. Ethics is something learned from day 1. Education on ethics points out some of the problems when dealing with ethics, but it doesn't teach ethics. Scruples are learned also; beyond the ancient form of measure, there is no education for scruples.

But it also takes discipline. Discipline to document what is really going on. Discipline to get it done the right/correct/best way. Discipline to not be seduced by 'creeping featurism' (a seduction/subversion listed above).

There will always be bugs and loopholes. Security is not a passive function, but it is often treated that way: fix it when something slips. Even I am 'guilty' of letting security lapse, partially due to ignorance and partially due to lack of time to devote to security auditing. Even with all of the C1-B2 auditing going on, it is still an active job. If nobody ever looks at the logs..... then there is no security.

The biggest result of the 'Attack of the Hungry Worm' will be a clamping down on the ease of use of networking. 
New 'conveniences' will be developed with new 'features' that will present new 'loopholes' in the never-ending seesaw battle between 'good and evil' (convenience and security).

Sigh... Back to work. Standard disclaimers, etc, etc, etc. and to be redundant etc.

-tank-
#include <disclaimer.h> /* nobody knows the trouble I .... */
zjat02@cra2.uucp (Jon A. Tankersley) (11/15/88)
I just checked numerous 4.0 systems that I have. Nary a one had a missing '+'. Are you sure you checked the installation log? I am pretty confident that there never were any stray ::0:0.... entries in any of mine; I use scripts to update my passwd files, and they are pretty dumb....

-tank-
#include <disclaimer.h> /* nobody knows the trouble I .... */
guy@auspex.UUCP (Guy Harris) (11/15/88)
>Excuse ME, but the last four lines of my SunOS 4.0 distribution tape
>password file are:
>
> +::0:0:::
> ::0:0:::
> ::0:0:::
> ::0:0:::

Ex*cuse* me, but I just looked at the password file on the Sun-3 and Sun-4 1/2" distribution tapes, both in the "root file system" tar file and in the "Install" optional software component (because it contains the "prototype" used to install diskless clients). All of them had

	+:

as the last line in the password file (in fact, I'll bet the password files were identical). No blank lines, and certainly no

	::0:0:::

I tried "passwd" with a last line like the one above, and it merely turned it into

	+::0:0:::

filling in the missing fields; it didn't insert a ::0:0::: line.

Now, I can't speak for:

1) the 1/4" distribution tapes, as we don't have them handy, although I would be *EXTREMELY* surprised if they were any different.

2) the Sun-2 distribution tapes; see 1)

3) the Sun386i

but I don't see any indication that the password file, as shipped or set up by Sun, has any ::0:0::: lines in it.
guy@auspex.UUCP (Guy Harris) (11/16/88)
>In article <439@auspex.UUCP> guy@auspex.UUCP (Guy Harris) writes:
>>If that fix is "have 'getpwent()' return NULL if the entry it looks at
>>is syntactically incorrect," the fix is simple but rather rude;
>
>That would be rude but it wasn't what I was talking about.
>All you really need to do is to skip over a bogus entry and
>resynchronize for the next one.

It may not be what you were talking about, but it's what AT&T did, at least in S5R3 and S5R3.1. As I think I stated, skipping bogus entries and proceeding to the next valid entry is, indeed, the right fix.
rbj@nav.icst.nbs.gov (Root Boy Jim) (11/16/88)
? Well, now, gets() is of course unsafe, but then there are
? read(), sprintf(), and no telling how many others. For that
? matter, *p++ = *q++

Not quite in the same way.  Read takes an argument which specifies the
maximum size of the buffer, no problem.  Copying a string
(*p++ = *q++), while a frequent source of bugs, is possible to
control, since strlen will tell you the length.  Likewise sprintf;
with a little care one can precompute the size and reserve a large
enough area.  One problem the latter two have is with segmentation
violations.  However, with gets(), one is totally at the mercy of data
that is outside the program, and thus not under control.

? haynes@ucscc.ucsc.edu
? haynes@ucscc.bitnet
? ...ucbvax!ucscc!haynes
? "Any clod can have the facts, but having opinions is an Art."
?         Charles McCabe, San Francisco Chronicle

	(Root Boy) Jim Cottrell	(301) 975-5688
	<rbj@nav.icst.nbs.gov> or <rbj@icst-cmr.arpa>
	Careful with that VAX Eugene!

I can't think about that. It doesn't go with HEDGES in the shape of
LITTLE LULU -- or ROBOTS making BRICKS...
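[Editorial sketch of the distinction above, in C.  The helper names
(read_line, copy_checked) and buffer sizes are illustrative only, not
from any poster's code: fgets() and a length check give the bounded
behavior that gets() and a bare *p++ = *q++ loop lack.]

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Bounded line read: unlike gets(), fgets() never writes past the
 * end of the buffer, no matter how long the input line is. */
static void read_line(char *buf, size_t size, FILE *fp)
{
    if (fgets(buf, (int)size, fp) == NULL)
        buf[0] = '\0';
    buf[strcspn(buf, "\n")] = '\0';   /* strip the newline, if any */
}

/* Bounded copy: the *p++ = *q++ idiom made safe by checking the
 * source length first, as the posting suggests strlen allows. */
static int copy_checked(char *dst, size_t size, const char *src)
{
    if (strlen(src) >= size)
        return -1;                    /* would overflow: refuse */
    strcpy(dst, src);
    return 0;
}
```

The point of the sketch is that the caller, not the input data,
decides the maximum number of bytes written.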
drears@ardec.arpa (Dennis G. Rears (FSAC)) (11/16/88)
John F. Haugh II writes:

>Do you *really* trust college students to write real software? If so, you
>must have never attended a university similar to the one I graduated from.

What do you have against students?  I guess this says something
negative about you (as a previous student), current students, and your
university.  The only people I distrust from the start are felons,
known crackers, and politicians :-).  Trust is something that builds
over time; it isn't forfeited simply by being a student.

Dennis
mohamed@popvax.harvard.edu (Mohamed Ellozy) (11/16/88)
In article <8890@smoke.BRL.MIL> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>
>People are focusing on the wrong problem. The Internet virus also
>attacked through a hole unrelated to gets(), and I know of at least
                                                 ^^^^^^^^^^^^^^^^^^
>three other such holes. The general problem is lack of sufficient
 ^^^^^^^^^^^^^^^^^^^^^^

This is what irritates the living daylights out of so many of us.  He
"knows" of at least three other such holes.  He is thus more learned,
perhaps even wiser, than we are.

	BUT WHAT THE HELL ARE YOU DOING TO GET THEM CLOSED???

Wizards who "know" about problems and pride themselves on it, but do
nothing, are little better than those who maliciously exploit them.
This wormy episode will only prove useful if it leads to a serious
effort to eradicate existing holes.  I suspect that vendors will now
be very sensitive (for a short period of time) to reports of security
problems.  Not too sure, though.  What have various vendors done for
sites which run anonymous ftp?  Expecting customers to learn of
problems from the net is not acceptable user support.
dlm@cuuxb.ATT.COM (Dennis L. Mumaugh) (11/16/88)
In article <8844@smoke.BRL.MIL> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>So where is the student to learn better? The current culture is
>founded more on the philosophy of pragmatism than anything else,
>and accordingly the student is encouraged in his belief that
>nearly anything is okay so long as he doesn't get caught.
>
>If you want to establish rational values as the norm, you have your
>work cut out for you. It's a worthwhile goal, but won't be
>accomplished quickly.

This is a short essay on the mores and morals of the computer culture.
It is prompted by the controversy over whether the originator of the
recent Internet worm should be hailed as a hero or hauled off to jail,
his life and career ruined by his actions.

There are two attitudes towards life that are exemplified by various
social systems.  In an authoritarian/totalitarian society, that which
is not permitted is forbidden.  In a "free society," that which is not
prohibited is permitted.  In the computer culture we have similar
attitudes.  Some people feel that since UNIX has file permissions, if
you don't protect your files they should be able to browse them (and
if your terminal is not locked, they can use it and browse).  Others
feel that personal directories and files are out of bounds.

Part of this culture clash comes from the differences between the
"academic community" and the "business community."  I remember back in
1967 when a freshman student of physics was making a nuisance of
himself at the University of Maryland Computer Center by breaking the
operating system and stealing time.  He led the systems people a merry
chase.  They finally stopped the activities by hiring him as a systems
programmer.  Today that person is famous as the inventor of <product
deleted> and was a professor at a well known academic institution.
[His name is deleted because he is now a well known person, but I knew
him way back when.]

Today, the same actions would result in disciplinary action and, since
the advent of the new federal law on computer security, would be cause
for criminal prosecution.  What was once considered a harmless prank
is now a "serious" offense.

What has changed?  Computers have changed.  They used to be toys of
the privileged few researchers; now they are the work horses of the
world.  The analogy is that between horses and the automobile: in the
old days, borrowing a horse for a bit wasn't that serious, while
nowadays joy riding in a car is a major offense.  [We did hang horse
thieves, though, didn't we?]

Our academic community encourages browsing and "snooping" as long as
we don't destroy data or conceal the origination of ideas
[plagiarism].  The ideal of co-operation between people and the spread
of knowledge is generally taught as the highest goal.  Our business
community is just the opposite.  We have found that information is
power is money.  The FSF to the contrary, computer data is now
valuable [I remember trying to get a mag tape through Canadian
Customs: those who said "Computer Data" paid duty; those who said
"Software" got by for free].  As more and more people commit their
fortunes and lives [figuratively] to computer media, the more
intolerant we will become of people who disrupt those computers or
idly browse through files.

In another newsgroup [news.sysadmin or some such] the question was
raised: "What authority does a systems administrator have to browse
files?"  I can remember some times when I happened upon a torrid love
affair being conducted by two married people via EMail, and ....
today I would almost be required to inform authority of this abuse of
computer resources.

Essentially what Doug is raising, and I am seconding, is that times
have changed.  This worm incident has rattled some cages and aroused
some sleeping dragons.  Hopefully, the professional societies will
provide a code of ethics about computer use in these areas.  If they
don't, the US Government will.  Already the new law could be used to
charge rogue players with a crime [unauthorized use of facilities].
Then of course those who read netnews without official sanction .....
I suspect that one could even make a case for routing personal mail
over the Internet as being a crime.
-- 
=Dennis L. Mumaugh
 Lisle, IL  ...!{att,lll-crg}!cuuxb!dlm  OR  cuuxb!dlm@arpa.att.com
gwyn@smoke.BRL.MIL (Doug Gwyn ) (11/17/88)
In article <270@popvax.harvard.edu> mohamed@popvax.UUCP (R06400@Mohamed Ellozy) writes:
-This is what irritates the living daylights out of so many of us.
-He "knows" of at least three other such holes. He is thus more
-learned, perhaps even wiser, than we are.
- BUT WHAT THE HELL ARE YOU DOING TO GET THEM CLOSED???
-Wizards who "know" about problems and pride themselves about it, but
-do nothing, are little better than those who mailiciously exploit them.
The BSD developers know of all three holes and have published fixes for
two of them. BRL's network host tester will probe for them and inform
system administrators if they have these holes.
I'm waiting for an apology.
gwyn@smoke.BRL.MIL (Doug Gwyn ) (11/17/88)
The problem is, ethics and legality have little logical connection with each other. One does not solve an ethical problem by passing crime laws. To take Mumaugh's example of playing "rogue", it IS technically a crime to do so with Federal facilities. However I am sure that this has not much deterred people from doing it. And one might wonder whether it is even unethical, much less something criminal. If the Federal bureaucracy were properly based on hierarchical authority/responsibility/accountability, then when a supervisor decided that such "abuse" was benign or even beneficial, it should be allowed. Unfortunately that is not the way the Civil Service operates, especially in peacetime. No, ethics and morality need to be self-motivated.
jbn@glacier.STANFORD.EDU (John B. Nagle) (11/17/88)
     The legal issues will be interesting.  It's not at all clear that
Morris ever "accessed" a Government computer.  He may have induced
computers owned by others to do so, but the legal implications of such
an act are not clear.  The case, assuming it ever comes to trial, will
break significant new legal ground.

					John Nagle
jb@cs.brown.edu (11/18/88)
There are a couple of other places where problems similar to gets()
overflowing its buffer might arise.  Normal usage of scanf() and
fscanf() can lead to the same problem when reading a string in from
someplace.  It is easy to specify the buffer size in the format, but I
have rarely seen this done.  For setuid programs, curses leaves the
same type of hole open with several of its input routines; it has
routines analogous to both gets() and scanf().  The issue with
strcpy() and sprintf() can be worked around, but read code that uses
them and you will find that most programmers do not put in all the
careful checks to make sure that the buffer is not overrun.  Maybe a
good reminder of this problem is needed to get people to clean up.

				Jim Bloom
				Brown University
				jb@cs.brown.edu
				uunet!brunix!jb
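[Editorial sketch of the scanf() fix Jim describes.  The helper name
parse_word and the size 16 are illustrative assumptions: the point is
only that a field width in the format string bounds the write.]

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Parse the first whitespace-delimited word of `input` into `buf`,
 * which must hold at least 16 bytes.  The field width 15 is what
 * bounds the write: a bare "%s" would copy an arbitrarily long word
 * and overrun the buffer, exactly as gets() does. */
static void parse_word(const char *input, char *buf)
{
    buf[0] = '\0';                 /* in case no word is found */
    sscanf(input, "%15s", buf);    /* at most 15 chars + NUL */
}
```

The same width-specifier trick applies to scanf() and fscanf(); the
only cost is keeping the width in the format string consistent with
the declared buffer size.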
sean@cadre.dsl.PITTSBURGH.EDU (Sean McLinden) (11/18/88)
In article <8909@smoke.BRL.MIL> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
:The problem is, ethics and legality have little logical connection
:with each other. One does not solve an ethical problem by passing
:crime laws.
:[deleted]
:No, ethics and morality need to be self-motivated.
Possibly, but social consciousness is learned. Children aren't born with
a sense for what is right and wrong, they are educated in that area. Most
of the education comes from personal experience: you touch a hot stove
only once. Insofar as what harms other people, we start out with a system
of rules which are replaced by reason when the child has enough experience
to make sense of it. An example is respect for personal property. Have you
ever known a three year old who DIDN'T think that everything was his/hers
to play with? Until they can appreciate the concept of individuality and
stop defining the world in terms only of their own existence, children
cannot understand that some things in their world are other people's
personal property and should be treated accordingly. This is learned,
it is not divined by the soul.
One problem (sic) with an open academic computing environment is the
fact that real world experience does not contain enough parallels to
allow people to reason about appropriate behavior. At least one can
say that if they do exist they are not obvious to everyone. There is
a perception that whatever a (computer) system allows you to do is
acceptable ("If I'm not allowed to run 32 processes simultaneously
why is MAXPROC defined to be 32?"; "If it isn't 'fair' for me to fire
up 12 LISP jobs in the background why does the shell support '&' ?").
There are also less obvious consequences of behavior that need to be
taught. The solitary programmer often has no knowledge of the administrative
issues surrounding the operation of a facility and the allocation of
resources in that community. How many people who have access to ARPANET
have read the ARPANET policy manual (how many copies of it are there
at YOUR institution)? Many rules of conduct in a programming environment
develop from the experience of people who functioned, for a time, in
a society without such rules. Before British colonialism, much of the
U.S. wilderness was lawless. Social rules and laws evolved from that
pioneer spirit because someone determined that these rules would be
needed in order to support a society. In many cases, generations of
experience were needed before an appropriate formalism existed.
I would agree with the claim that you don't make a person behave
ethically by exposing them to ethics. But you can, at least, provide
a background which will allow them to understand why certain social
conventions exist. Many of these would not be obvious to everyone,
which is the justification for doing it in the first place.
Sean McLinden
Decision Systems Laboratory
trn@aplcomm.jhuapl.edu (Tony Nardo) (11/19/88)
In article <8908@smoke.BRL.MIL> gwyn@brl.arpa (Doug Gwyn (VLD/VMB) <gwyn>) writes:
>In article <270@popvax.harvard.edu> mohamed@popvax.UUCP (R06400@Mohamed Ellozy) writes:
>-This is what irritates the living daylights out of so many of us.
>-He "knows" of at least three other such holes. He is thus more
>-learned, perhaps even wiser, than we are.
>-	BUT WHAT THE HELL ARE YOU DOING TO GET THEM CLOSED???
>
>The BSD developers know of all three holes and have published fixes for
>two of them. BRL's network host tester will probe for them and inform
>system administrators if they have these holes.

I don't mean to sound facetious, but I seem to recall some news
article mentioning that there were 60,000+ nodes on the Internet.
Let's assume that only 5% of these systems use some flavor of 4.* BSD.
Let's also assume that only 40% of those systems have administrators
who wish to have those holes identified and (possibly) plugged.  Does
BRL have the facilities to test 1200+ nodes before some other clever
person develops a copycat "infection"?  Or even to distribute a "hole
test kit" to that many sites?

There *must* be a better way to distribute information on how to check
for these holes than to have every Internet site queue up for BRL's
test...

							Tony

P.S.  To Mohamed: if you discovered one of these holes, and realized
that a second worm could very easily be written to exploit it, what
would *you* do?  Actually, anyone may feel free to answer this.
Please reply to me by E-mail; I'll attempt to summarize.
==============================================================================
ARPA, BITNET:	trn@aplcomm.jhuapl.edu
UUCP:		{backbone!}mimsy!aplcomm!trn

"Always remember that those who can, do, and that those who can't, teach.
 And those who can't teach become critics.  That's why there're so many
 of them."	PORTRAIT OF THE ARTIST AS A YOUNG GOD (Stephen Goldin)
==============================================================================
jep@fantasci.UUCP (Joseph E Poplawski) (11/19/88)
In article <48300017@hcx3> gwp@hcx3.SSD.HARRIS.COM writes:
>Written 5:40 pm Nov 8, 1988 by scott@attcan.UUCP (Scott MacQuarrie)
>> There is a product available from AT&T's Federal Systems group called
>> MLS (Multi-Level Security) which provides B1-level security in a System V
>> Release 3.1 environment. I have seen the product on a 3B2; its
>> availability from other vendors would probably require work by those
>> vendors.
>
>It did. It's done. It's called CX/SX.

Can anyone post more information on what these products do to increase
system security?  Do they require source licenses?

-Jo
-------------------------------------------------------------------------------
| Joseph E Poplawski  (Jo)            US Mail:  1621 Jackson Street           |
|                                               Cinnaminson NJ 08077          |
| UUCP: ..!rutgers!rochester!moscom!telesci!fantasci!jep                      |
|       ..!princeton!telesci!fantasci!jep                                     |
|       ..!pyrnj!telesci!fantasci!jep  Phone:   +1 609 786-8099 home          |
-------------------------------------------------------------------------------
| He who dies with the most toys wins!                                        |
-------------------------------------------------------------------------------
| Copyright (C) 1988 Joseph E Poplawski  All rights reserved.                 |
-------------------------------------------------------------------------------
jc@minya.UUCP (John Chambers) (11/21/88)
In article <8909@smoke.BRL.MIL>, gwyn@smoke.BRL.MIL (Doug Gwyn ) writes:
> The problem is, ethics and legality have little logical connection
> with each other. One does not solve an ethical problem by passing
> crime laws. To take Mumaugh's example of playing "rogue", it IS
> technically a crime to do so with Federal facilities.

And it's also a violation of almost all employers' rules (including
the government's) for an employee to have a picture of his/her family
members on their desk.  This is, after all, a use of the employer's
property for purposes of personal entertainment.

Ask any lawyer about the meaning of the phrase "De minimis non curat
lex".  (Also, while you're at it, ask for the well-known limerick that
ends with that line.)
-- 
John Chambers <{adelie,ima,maynard,mit-eddie}!minya!{jc,root}> (617/484-6393)

[Any errors in the above are due to failures in the logic of the
keyboard, not in the fingers that did the typing.]
jsdy@hadron.UUCP (Joseph S. D. Yao) (11/22/88)
In article <452@auspex.UUCP> guy@auspex.UUCP (Guy Harris) writes:
< >Excuse ME, but the last four lines of my SunOS 4.0 distribution tape
< >password file are:
< > +::0:0:::
< > ::0:0:::
< > ::0:0:::
< > ::0:0:::
>... All of them had
>	+:
>as the last line in the password file ...
>I tried "passwd" ...
>turned it into
>	+::0:0:::

A lot of people - and some editor programs - have the terrible habit
of leaving a blank line at the end of a file after editing it.  The
'passwd' program, at least all versions that I know of, tends to turn
this into the offending line.  This happens because getpwent() returns
a blank pwd entry, and putpwent() or the printf() used inserts the
colons.  I'd suggest that all getpwent()'s skip over blank lines
completely.

	Joe Yao		jsdy@hadron.COM (not yet domainised)
	hadron!jsdy@{uunet.UU.NET,dtix.ARPA,decuac.DEC.COM}
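[Editorial sketch of the filter Joe suggests, not the actual libc
source.  The function name and the seven-field rule are assumptions
drawn from the classic passwd format; NIS +/- handling is simplified.]

```c
#include <assert.h>

/* Accept a passwd line only if it is non-blank, has a non-empty name
 * field, and (unless it is a NIS +/- entry) has the six ':'
 * separators of a seven-field entry.  A trailing blank line (the
 * thing passwd(1) would otherwise pad out into "::0:0:::", a
 * passwordless uid-0 entry) is rejected before it can do harm. */
static int pw_line_ok(const char *line)
{
    int colons = 0;
    if (line[0] == '\0' || line[0] == '\n' || line[0] == ':')
        return 0;                 /* blank line or empty name: skip */
    if (line[0] == '+' || line[0] == '-')
        return 1;                 /* NIS inclusion/exclusion entry */
    for (; *line != '\0' && *line != '\n'; line++)
        if (*line == ':')
            colons++;
    return colons == 6;
}
```

A getpwent() that applied such a check before building the pwd entry
would skip the blank line instead of returning an empty entry for
passwd to pad out.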
jsdy@hadron.UUCP (Joseph S. D. Yao) (11/22/88)
In article <17519@adm.BRL.MIL> rbj@nav.icst.nbs.gov (Root Boy Jim) writes:
>I can see it now, a paper entitled `Local Variables Considered Harmful'.

To mis-quote Tom Lehrer:  When properly mis-used, everything will lose.

	Joe Yao