[comp.risks] RISKS DIGEST 7.89

RISKS@KL.SRI.COM (RISKS FORUM, Peter G. Neumann -- Coordinator) (12/07/88)

RISKS-LIST: RISKS-FORUM Digest  Tuesday 6 December 1988   Volume 7 : Issue 89

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS 
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Computer Literacy #4 (Ronni Rosenberg)
  Privacy versus honesty/equality (Jerry Carlin)
  Computerized speeding tickets? (Clifford Johnson)
  Subways that "know" who's on board (Marc J Balcer)
  Automatic toll systems -- Dallas (Andrew R. MacBride)
  "Hackers", "crackers", "snackers", and ethics ("Maj. Doug Hardie")
  `hacker' is already a dictionary entry (Joe Morris, Douglas Jones)
  Re: /dev/*mem and superuser (Jeff Makey)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in good
taste, objective, coherent, concise, and nonrepetitious.  Diversity is welcome.
CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive "Subject:" line
(otherwise they may be ignored).  REQUESTS to RISKS-Request@CSL.SRI.COM.
FOR VOL i ISSUE j / ftp kl.sri.com / login anonymous (ANY NONNULL PASSWORD) /
  get stripe:<risks>risks-i.j ... (OR TRY cd stripe:<risks> / get risks-i.j ...
  Volume summaries in (i, max j) = (1,46),(2,57),(3,92),(4,97),(5,85),(6,95).

----------------------------------------------------------------------

Date: Tue, 6 Dec 88 15:12:11 EST
From: ronni@wheaties.ai.mit.edu (Ronni Rosenberg)
Subject: Computer Literacy #4

What are your reactions to a proposal for a different sort of "computer
literacy" course, described below?  (I am not saying that all schools should
teach such a course.)  Is it a good or bad idea?  Why?  Should the description
be changed?  If so, how?  How do you compare this with what you know about
existing computer-literacy courses?  Who should develop such a curriculum?
Who should pay for it?  Please respond directly to me.  Thanks.

	      *               *               *               *

Compared to current computer-literacy classes, the proposed course would spend
much less time on the mechanics of operating machines, on the syntax of
applications software or programming languages, and on rote learning of lists,
from computer components to uses.  It would spend much more time considering
the capabilities and limitations of computers, through discussions of the
impacts of important computer applications.  This might be a standalone course
or a series of discussions interwoven into courses in, for instance, social
studies or history.

One specific example of material that could contribute to meaningful education
about computers is a multi-media presentation entitled "Reliability and
Risk: Computers and Nuclear War," produced and distributed by CPSR.  The
presentation explains how current political and military trends decrease
the time allowed for people to react to a crisis, thereby shifting critical
decision-making responsibilities to computers.  It attacks the myth of
computer infallibility by describing different types of computer errors, their
sources and consequences.  It explores the growing reliance on computerized
decision-making and how a computer error, especially in times of crisis, could
trigger an accidental nuclear war.  Lasting a half-hour, the program obviously
cannot cover the topic in great depth.  But it does present salient points
about an important and complex area of computer use, greatly heightens
people's awareness of problems that they are unlikely to learn about from
magazine articles about computers, and stimulates exciting discussions and
further thought.  The presentation uses no computers, and the intended
audience needs no previous computer experience.

The proposed course might include discussions of
 *  SDI's computing requirements -- so students could consider the concept of
    software trustworthiness and the potential for design errors in complex
    systems.
 *  The Vincennes episode -- so they could consider the difficulty of using a
    system outside the boundaries of its intended use.
 *  The FBI's National Crime Information Center (NCIC) -- so they could 
    consider the relationship between civil liberties and computer technology.
 *  The National Test Bed, war games -- so they could consider the limits of
    computer simulations.
 *  Computerized monitoring techniques -- so they could consider impacts of
    computers on the workplace.
 *  How computer science is funded -- so they could consider which sorts of
    problems society views as important.
 *  Some of the myriad RISKS stories -- so they could consider the risks of
    depending on computer systems.

And so on.  Overall, the course would emphasize the importance of the social
and political context in which a computer system is developed and used.

------------------------------

Date: 1 Dec 88 20:45:26 GMT
From: jmc@ptsfa.PacBell.COM (Jerry Carlin)
Subject: Privacy versus honesty/equality
Organization: Pacific * Bell, San Ramon, CA

The following is from an article: "In Sweden, the public can read 
prime minister's mail" by Eva Janzon, Associated Press.

In Sweden, government records have been open to the public since 1766.
This includes the right to read the Prime Minister's mail (except for
a few classified items). Not only that, but everyone's records are
effectively public:

	"Knowing your neighbor's date of birth is enough to gain access
	to files at the National Taxation Board which lists income and
	tax from the previous year, church membership, marital status
	and current address.

	"If you take the number to the county police, you can find out
	about any unpaid bills.  Other registers list education, state of
	health and membership in associations.

	"All this has been accepted as a price for keeping people honest
	in a society that strives for equality."

The article did state that some Swedes dislike this invasion of privacy.

Is the risk of inequality and dishonesty more important than the risk 
to privacy?

Jerry Carlin (415) 823-2441 {bellcore,sun,ames,pyramid}!pacbell!jmc

------------------------------

Date:      Mon,  5 Dec 88 17:54:29 PST
From: "Clifford Johnson" <GA.CJJ@Forsythe.Stanford.EDU>
Subject: Computerized speeding tickets?

> Alas, as a Mass police officer pointed out in an interview, you have to catch
> someone *in the act* of speeding to get them for it.  Probably something to 
> do with that annoying bill of rights...

Not so in every state, I believe.  I recall a news story from some 18 years ago
in a desert state (Arizona?), in which a cop called another cop in another town
to look out for a certain car.  The defense argued that there was no way to
know for sure that the speed limit had been exceeded merely because the total
distance divided by the elapsed time exceeded the speed limit.  A university
mathematician (a measure theorist) testified as to the meaning of the Mean
Value Theorem, and the speeding ticket was upheld by a presumably puzzled judge
because no counter-expert could be found to dispute him.
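
For those interested, the mathematical point (my reconstruction of the
argument, not the trial testimony): if the car's position x(t) is a
differentiable function of time, then, in LaTeX notation,

    \frac{x(T) - x(0)}{T} > v_{limit}
    \quad\Longrightarrow\quad
    \exists\, t^* \in (0,T) \;:\; x'(t^*) = \frac{x(T) - x(0)}{T} > v_{limit}

That is, if the average speed over the whole stretch exceeded the limit, the
Mean Value Theorem guarantees an instant at which the instantaneous speed did
too.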

Does anyone know whether the Mass. "rule" is simply local?

   [And then there is the tale of the San Francisco police using computer
   records interactively to tow up-scale vehicles (on the grounds that their
   owners are more likely to pay up to get their cars back when towed).
   Yesterday they towed a car belonging to an undercover agent.  Referring
   to Ronni's item (particularly NCIC) in this issue, suppose the computer
   contained the information that the car belonged to an undercover agent.
   Then we have to assume that the agent was NOT ADEQUATELY under cover,
   especially if any further identification was included.  PGN]

------------------------------

Date: Tue, 6 Dec 88 09:38:49 EST
From: balcer@gypsy.siemens.com (Marc J Balcer)
Subject: Subways that "know" who's on board

From the Philadelphia Inquirer, Saturday, December 3:

SEPTA TURNSTILES TAKE A HIGH-TECH SPIN
by Mark Bowden, Inquirer Staff Writer

[...]  The new turnstiles still accept tokens, but they are also equipped with
magnetic scanners that enable passengers to let themselves into the station
just by sliding through their new, magnetically encoded weekly and monthly
passes.  The old turnstiles only accepted tokens.  [...]  Because each of the
turnstiles is connected to a central computer, and each card is encoded with a
serial number, use of the new turnstiles will help SEPTA compile far more
detailed records of how people use the transit system.

"If someone gets on the Elevated in the Northeast, uses the Broad Street Subway
at midday, and then commutes home at night on the Elevated, we will have an
exact record of all those trips," said [Robert E.] Wooten, SEPTA assistant
general manager for public affairs."  It will provide our operations planning
department with lots of detailed information about who gets on where, when, and
how often.
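
To make concrete what an "exact record of all those trips" means, here is a
minimal sketch, in C, of the record-linking that a serial-numbered fare card
permits.  Everything in it (station names, serial numbers, times) is invented
for illustration; this is not SEPTA's software:

  /* Hypothetical sketch: compile one rider's trips from a central
   * log of turnstile taps, keyed by the card's serial number. */
  #include <stdio.h>

  struct tap {
      long serial;          /* number encoded on the fare card */
      const char *station;  /* turnstile location */
      const char *time;     /* time of entry */
  };

  int main(void)
  {
      /* Invented sample data matching the article's example rider. */
      struct tap log[] = {
          { 10274, "Elevated (Northeast)", "07:42" },
          { 10274, "Broad Street Subway",  "12:15" },
          { 55511, "City Hall",            "12:20" },
          { 10274, "Elevated (Northeast)", "17:58" },
      };
      int n = sizeof log / sizeof log[0];
      long who = 10274;     /* one serial number selects one rider */
      int i;

      /* A scan keyed on the serial number yields an exact record of
       * every trip that rider made. */
      for (i = 0; i < n; i++)
          if (log[i].serial == who)
              printf("card %ld: entered %s at %s\n",
                     who, log[i].station, log[i].time);
      return 0;
  }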

Marc J. Balcer    [balcer@gypsy.siemens.com]      (609) 734-6531
Siemens Research Center, 755 College Road East, Princeton, NJ 08540

------------------------------

Date: Tue, 6 Dec 88 00:19:16 PST
From: c60a-1bl@WEB.Berkeley.EDU
Subject: Automatic toll systems -- Dallas

	Regarding an earlier discussion of automatic toll systems:

	This evening (~11:45pm PST) on CNN, I caught the tail end of a
report on an automated toll-collection system being tested in Dallas.
The device consists of (and I quote) "chips and diodes and capacitors
on a board", and is apparently queried at each toll station. During a
brief statement, the president(?) of AMTECH, Inc. discussed plans
for the use of this system in many cities and in the rail network.

	Anyone have comments or more information?
	(I wish I had seen the beginning of the report...)
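
	In the absence of detail, here is a guess at the basic
query-and-bill cycle such a system might perform at each toll plaza, written
in C.  The account structure, the 25-cent toll, and the ID numbers are all
invented; none of this comes from the report or from AMTECH:

  /* Hypothetical toll-plaza loop: query the passing transponder for
   * its ID, look up the prepaid account, and debit the toll. */
  #include <stdio.h>

  struct account {
      long id;       /* number reported by the vehicle's transponder */
      long cents;    /* prepaid balance, in cents */
  };

  /* Stand-in for the radio query of a transponder at the plaza. */
  static long read_transponder(void)
  {
      return 4711;   /* invented ID */
  }

  int main(void)
  {
      struct account accts[] = { { 4711, 1000 }, { 4712, 50 } };
      int n = sizeof accts / sizeof accts[0];
      long toll = 25;
      long id = read_transponder();
      int i;

      for (i = 0; i < n; i++) {
          if (accts[i].id == id) {
              accts[i].cents -= toll;
              printf("charged account %ld: balance now %ld cents\n",
                     id, accts[i].cents);
              return 0;
          }
      }
      printf("unknown transponder %ld\n", id);
      return 1;
  }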

Andrew R. MacBride     c60a-1bl@widow.berkeley.edu (128.32.185.4)

------------------------------

Date:  Tue, 6 Dec 88 09:42 EST
From: "Maj. Doug Hardie" <Hardie@DOCKMASTER.ARPA>
Subject: "Hackers", "crackers", "snackers", and ethics (RISKS-7.86)

> Moreover, in more mature scientific fields, such as medicine, it
> is not left up to the experimenter to decide for himself what is
> ethically acceptable; he or she must convince review boards that
> include both peers and (one hopes) members of the affected public.

The cost of medical research is significant.  It is not within the resources of
your average high school student.  The cost of hacking "computer research" is
very low.  I seriously doubt that any kind of review system could be set up
that would be able to cope with the volume of this problem.  Even if you could
set it up, it would be a bureaucracy unto itself.

Also, I would point out that the term "hacker" was in common use when I was in
college (1964-69) to refer to people who had no real understanding of what they
were doing, but just banged away at anything in a random pattern, hoping that
something would work.  Calling an engineering student a hacker was the ultimate
put-down.

------------------------------

Date: Tue, 06 Dec 88 11:38:21 EST
From: Joe Morris (jcmorris@mitre.arpa) <jcmorris@mitre.arpa>
Subject: `hacker' is already a dictionary entry

In RISKS-7.87, Frank Maginnis observes:
>                                    "Hacker" and "virus" will undoubtedly
>appear very soon in standard English dictionaries with the general public's
>understanding of the terms, not the profession's -- "hacker" probably
>already has! We'll just have to adapt.

I can't speak for 'virus', but 'hacker' is already there.  From the 1986 
edition of _Webster's_New_World_Dictionary_ (Prentice-Hall) comes the
following entry:

  hack.er n. 1. a person who hacks (see hack(1)) 2. an unskilled golfer,
  tennis player, etc. 3. a talented amateur user of computers, specif. one 
  who attempts to gain unauthorized access to files in various systems

The dictionary doesn't have the verb _hack_ defined in a computer sense, but
that may be waiting on the next edition.

Can anyone point to the first use of the term?  I remember using it in 1962
(and have comments in programs to prove it) but it seemed to be well-used
by then.  

------------------------------

Date: Tue, 6 Dec 88 13:51:24 CST
From: Douglas Jones <jones@herky.cs.uiowa.edu>
Subject: Re: "Hackers,"  "crackers,"  "snackers,"  and ethics

In his note of Mon, 5 Dec 88 10:27:35 PST, P. G. Neumann points out that we
have to do something about whistle-blowing, and then returns to questions of
hacking being dangerous, especially when we have flawed systems.  These two
points bring to mind a sensible business practice of the early 1970s that I
have not seen used recently.

In the summer of 1972, I worked for Com-Share Incorporated, one of two firms
to commercialize the Berkeley Timesharing System.  At the time, I had not yet
heard the term "hacker", but we certainly knew that there were such people.
Com-Share had two interesting policies regarding them:

  1) All Com-Share employees were encouraged to use Com-Share facilities
      for personal use during off-hours, and the majority of personal use
      was assumed to be of a sort we would now call hacking.

  2) Com-Share had a standing reward of $500 for anyone who could expose
      a flaw in their system security, and while I was there, they raised
      the reward to $1000.

In concert, these policies encouraged hacking, but they made it into a
constructive activity.  An occasionally cited aspect of the "hacker ethic" is
that when hackers find something wrong with a system, they should report it.
The catch is that reporting a flaw might lead to its being fixed, which in
turn might deny the hacker future access.  A reward can overcome this
disincentive to reporting bugs.

When I worked on the PLATO system at Illinois in the mid '70s, the system
administrators viewed the large community of PLATO hackers (mostly writing
and playing games, but with occasional password security attacks of the kmem
variety) as useful because they would exercise new system features long
before "legitimate" users would find them, and because they provided a heavy
system load before there was much of a legitimate user community.  As the
legitimate community grew and the excess capacity of the system diminished,
game playing and other "hacking" activities were severely curtailed, but
never eliminated.

In recent years, most computer-crime legislation I have seen has made
almost anything resembling hacking into a crime, and many system
administrators no longer appear interested in the benefits that
a carefully managed hacker community can provide.  A hacker who finds
a flaw in a system and reports it is viewed as a criminal with
a conscience instead of a benefit to society.

In a way, hackers who report flaws that they find in a system are like
whistleblowers, and this recent legal and managerial trend is quite
analogous to the "shoot-the-messenger" approach that is commonly
applied to whistleblowers.

------------------------------

Date:  6 Dec 1988 1258-PST (Tuesday)
From: Jeff Makey <Makey@LOGICON.ARPA>
Subject: Re: /dev/*mem and superuser

In RISKS 7.87, Paul E. McKenney <mckenney@spam.istc.sri.com> described
how to protect /dev/*mem on UNIX systems from uncontrolled read
access.  Unfortunately, he made a small mistake.  /bin/ps and other
programs that need access to /dev/mem should have their modes set to
2755.  Use of mode 4755 (as Paul suggested) sets the setuid bit rather
than the setgid bit.  Since /dev/mem is owned by root and Paul also
suggested changing the owner of /bin/ps to bin, there is probably no
security problem in his fix, but ps won't work.

I have done this on my 4.2 BSD system with no apparent ill effects.
In addition to /bin/ps, /usr/ucb/w needs this treatment.
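
For concreteness, here is a minimal sketch of the corrected fix as a C
program.  One assumption to flag: neither article names the group that owns
/dev/*mem, so I have used "kmem" (a common BSD convention); substitute
whatever group your system uses:

  /* Sketch of the fix: put ps and w in the kmem group (assumed name)
   * and set mode 2755 (setgid), not 4755 (setuid), so they run with
   * the group that may read a group-readable /dev/mem. */
  #include <stdio.h>
  #include <grp.h>
  #include <sys/types.h>
  #include <sys/stat.h>
  #include <unistd.h>

  int main(void)
  {
      struct group *g = getgrnam("kmem");   /* assumed group name */
      const char *progs[] = { "/bin/ps", "/usr/ucb/w" };
      int i;

      if (g == NULL) {
          fprintf(stderr, "no such group: kmem\n");
          return 1;
      }
      for (i = 0; i < 2; i++) {
          /* leave the owner alone; change only the group */
          if (chown(progs[i], (uid_t)-1, g->gr_gid) != 0)
              perror(progs[i]);
          /* 02000 is the setgid bit; 04000 (setuid, as in mode 4755)
           * would not grant group access.  The chmod comes last
           * because chown may clear the setgid bit. */
          if (chmod(progs[i], 02755) != 0)
              perror(progs[i]);
      }
      return 0;
  }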

Once again, we encounter a risk in (blindly) applying untested
bugfixes.  This comment, of course, applies to my own suggestions in
the paragraphs above.

------------------------------

End of RISKS-FORUM Digest 7.89
************************