daveb@gonzo.UUCP (Dave Brower) (11/13/88)
In article <5390@medusa.cs.purdue.edu> spaf@cs.purdue.edu (Gene Spafford) writes:
>In article <542@dutrun.UUCP> hans@duttnph.UUCP (Hans Buurman) writes:
>>Come on, Mr. Spafford. You cannot believe that a course in ethics
>>will get each and every undergraduate to live by the rules. And remember,
>>it's the individual that we're afraid of, not the group.
>
>I never claimed a course in ethics (or anything else) will help each
>and every undergraduate live by the rules. However, it will help a
>significant number of students understand the rules a bit better than
>the current system does, and that is important...

As a data point, I observe that the curriculum required by most Bar
Associations for accreditation of law schools includes courses in
"Professional Responsibility." My dim recollection is that this was added
in the '70s, after Watergate, in response to the belief that legal
training had failed to instill proper ethics.

I don't know if this is seen as a successful innovation. It would be hard
to say that lawyers are generally more ethical now than they were in
1972. Certainly public confidence in that profession has not increased in
the aftermath.

This is a very difficult issue. To add something to a curriculum means
dropping something else. Should we trade "Formal Testing Methods" for
"Professional Responsibility?"

The central issue is public confidence in computer systems and their
related formal and informal institutions. It is why universities take
such a hard line on plagiarism, and why lawyers do get disbarred.

This case points questions at the professional/academic computer science
community. Is this an isolated case to be dismissed, or an indication of
the same general ethical laxity widely believed to exist in the legal
profession?

It is therefore *most* troubling that the worm-master of the Internet is
believed to be a fairly typical hacker/scientist within the
academic/professional community. It would be much easier to dismiss if
this were the proverbial 14-year-old with an Apple II and a modem. Then
the finger wouldn't be pointed at us.

And yet, as one previous poster noted, most personal ethical systems are
in place before one gets to college. The kid who was a cracker at 14
seems unlikely to be changed by a one-semester course at 21.

I was tempted to restrict followups to comp.edu, but chose not to. This
may very well be the most important discussion that has ever taken place
on the network, and it seems unwise to limit it or to wish that it would
just go away.

-dB
spaf@cs.purdue.edu (Gene Spafford) (11/14/88)
In article <460@gonzo.UUCP> daveb@gonzo.UUCP (Dave Brower) writes:
>This is a very difficult issue. To add something to a curriculum means
>dropping something else.

Why do you say that? If we add material on parallel architectures and
algorithms, does that mean we should drop OS? Or if we add a section on
functional languages, should we drop any mention of compilers?

A curriculum is an evolving thing, meant to instruct students both in the
important topics and in how to integrate those topics and continue their
education. Adding new material does not always mean something else gets
dropped. It can mean that some older topics get less emphasis, or it
could simply mean that another required course is added to the core.
--
Gene Spafford
NSF/Purdue/U of Florida Software Engineering Research Center,
Dept. of Computer Sciences, Purdue University, W. Lafayette IN 47907-2004
Internet: spaf@cs.purdue.edu    uucp: ...!{decwrl,gatech,ucbvax}!purdue!spaf