stevesu@copper.TEK.COM (Steve Summit) (06/26/87)
If you have a system you really don't want broken into, you had better take active steps to prevent it. Trusting people to be reasonable, or crying for laws against "hacking," is not really going to help. A while ago a law was proposed that would have made it illegal to monitor the radio bands that cellular and/or cordless phones use. (I don't know if it's in effect now, or if it got repealed.) This was lunacy. If I have sensitive information that I can't afford to have overheard, then I'm a fool if I broadcast it into the air unencrypted. If somebody listens in, it doesn't make much difference whether that was illegal or not; I still get hurt. On the other hand, if I encrypt my data, there's no need to make eavesdropping illegal.

Several people have pointed out that this kind of attitude, that the burden is on me to protect my data, is inappropriate and/or tragic, and that in a just and perfect world I wouldn't have to be so paranoid. They point out that I shouldn't have to bar my door against every conceivable sort of attack, or walk around in armor. However, there is a subtle but extremely significant difference between crime involving property and crime involving information: for many kinds of information crime, *you don't even know you're being robbed*.

If I leave my front door wide open (as a matter of fact, I do, when I'm at home and the weather is nice), I may be foolish, especially if I live in a high-crime area, but at least I can see a burglar walking in, and maybe take steps to defend myself. Even if I don't see the burglar, I will likely discover, sooner rather than later, that my stereo is missing. Furthermore, a stereo is a tangible item, and I have some (albeit small) chance of recovering it. On the other hand, if somebody breaks into my computer system (whether the "front door" is wide open or not), I may never know. If major damage is done, I'll notice it right away; if some files are deleted, I'll probably discover it sooner or later; but if files are merely perused, how am I to know that information has been taken? Of course, I could have had more advanced security features on my system, to detect unauthorized access and leave audit trails of file access, but these are precisely the things that people are complaining they shouldn't have to do.

I am in complete agreement with people's sentiments about reasonable levels of protection. It would be a terrible commentary on today's society if everybody had to quadruple-lock their doors, or arm themselves like Rambo before taking a walk in the park. (Need I point out, though, that in some communities both statements are, tragically, already true?) However, computer crime is so new that direct analogies with "physical" crime simply don't apply. For one thing, we don't have a societal consensus on what is reasonable behavior and what isn't, on what you can reasonably expect other people to do (and to restrain themselves from doing), and on what you have to protect yourself against. Furthermore, the current level of "security" in many, many computer systems is orders of magnitude below the barest minimum levels of security we take for granted in other areas.

In a lot of ways, computer crime is like stealing candy from a baby. Think about it. Stealing candy from a baby is very different from stealing stereos from houses, no matter how many locks are or aren't on the door. The baby may be hardly aware of the candy in its hand; it is likely to hand it to you. It doesn't know what possession means. If you know the baby, and you don't think it wants the candy, you don't consider your act "stealing" at all.

Computer crime is new and different stuff. If you have a system that's at all important, and you aren't doing basic things to protect it, then I'm sorry, but you are burying your head in the sand.

                                        Steve Summit
                                        stevesu@copper.tek.com
chris@mimsy.UUCP (Chris Torek) (07/02/87)
In article <1167@copper.TEK.COM> stevesu@copper.TEK.COM (Steve Summit) writes:
>... However, there is a subtle but extremely significant
>difference between crime involving property and crime involving
>information: for many kinds of information crime, *you don't even
>know you're being robbed.* ... if files are merely perused, how
>am I to know that information has been taken?

This is indeed the point, and it demonstrates a `bug' in the description of the crime. The information has not been *taken*: it has been *copied*. How this affects the value of the information depends upon the information itself. Some information is worthless, some priceless; some becomes worthless once copied, and some gains value with every copy. [1]

Analogies to thieves breaking into houses are inappropriate for a further reason, one that will, with time, disappear. It is, quite simply, that the proverbial `kid with the modem' who dials a number he got from a friend, sees `login:', types `guest', and gets in may never have been told what he *should not* do. The same kid in front of the house's front door (whether open or closed or locked) probably has a very good idea of what he should not do; it has been pressed upon him during all the years he has lived in our society. Just as `everyone knows' how to dial a telephone `without being taught', [2] people know what is permitted in other familiar situations. Computer access is often a new situation, one in which the familiar limits do not apply. (This does not mean unauthorised access is right. I have personal views that I will not explain here.)

-----
[1] An example of the last kind of information is a recommendation that reads `Chris Torek is a wonderful Unix programmer. If you have any Unix problems, hire this guy, whatever his price.' At least, its value to *me* grows with every copy. :-)

[2] This is an utter joke to those who have never seen a telephone before. And even now, in airports, you may have trouble with the new phones. I did.
-- 
In-Real-Life: Chris Torek, Univ of MD Comp Sci Dept (+1 301 454 7690)
Domain: chris@mimsy.umd.edu    Path: seismo!mimsy!chris