[comp.risks] RISKS DIGEST 10.50

risks@CSL.SRI.COM (RISKS Forum) (10/16/90)

RISKS-LIST: RISKS-FORUM Digest  Monday 15 October 1990  Volume 10 : Issue 50

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS 
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  Hackers blackmail five UK banks (Pete Mellor)
  Equinox on A320 (Robert Dorsett)
  Re: A320s and Northwest Airlines (Chris Davis)
  Re: Ada MultiTasking (Chet Laughlin)
  Re: Expert system in the loop (Randall Davis)
  Announcement of CPSR annual meeting (Lesley Kalmin)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in good
taste, objective, coherent, concise, and nonrepetitious.  Diversity is welcome.
CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive "Subject:" line
(otherwise they may be ignored).  REQUESTS to RISKS-Request@CSL.SRI.COM.
TO FTP VOL i ISSUE j:  ftp CRVAX.sri.com<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>; j is TWO digits.  Vol summaries in 
risks-i.00 (j=0); "dir risks-*.*<CR>" gives directory; bye logs out.
ALL CONTRIBUTIONS ARE CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
The most relevant contributions may appear in the RISKS section of regular
issues of ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: Mon, 15 Oct 90 17:25:56 PDT
From: Pete Mellor <pm@cs.city.ac.uk>
Subject: Hackers blackmail five banks (UK)

Excerpts from The Independent on Sunday, 14 Oct 1990:

Headline: "Hackers blackmail five banks"

Subhead: "Mysterious computer experts demand money to reveal how they 
          penetrated sophisticated security"

By-line: Richard Thomson

 "At least four British clearing banks and one merchant bank in the City are
  being blackmailed by a mysterious group of computer hackers who have broken
  into their central computer systems over the last six months. These breaches
  of computer security may be the largest and most sophisticated yet among
  British banks.

  The electronic break-ins, which began last May, could cause chaos for the
  banks involved. Once inside their systems, the hackers could steal 
  information or indulge in sabotage, such as planting false data or damaging
  complex computer programs. It is unlikely, however, that they would be able
  to steal money.

  So far, the hackers have contented themselves with demanding substantial sums
  of money in return for showing the banks how their systems were penetrated.
  None of the banks has yet paid.

  [Stuff omitted]

  One computer expert described their level of expertise as "truly frightening".
  They are not believed to have links with organised crime, which has become
  heavily involved in computer hacking in the US over the last two to three 
  years. [Any comments?? - PM]

  It is a severe embarrassment for the banking community, which is frightened
  that public awareness of the security breach could undermine public 
  confidence. As a result, they have not called in the police but some have 
  hired a firm of private investigators, Network Security Management, which
  is owned by Hambros Bank and specialises in computer fraud. It is common
  for banks not to report fraud and security failures to the police for fear 
  of damaging publicity.

  All the banks approached either denied that they were victims of the 
  blackmail attempt or refused to comment.

  The hunt for the hackers is being led by David Price, managing director of 
  NSM, who confirmed his firm was investigating security breaches at five
  British banks. "I am confident of success in catching the hackers." he said.

  [Stuff omitted]

  Security measures were tightened after a large computer fraud at a leading
  City bank three years ago. Although the bank involved was never named, it
  is understood the money was never recovered. [Anyone got the details?? - PM]

  [Stuff omitted]

  According to an expert, who recently advised one of the big four clearers
  on its computer systems, there are few people who understand the bank's
  system well enough even to detect a break-in.

  [Stuff omitted]

  According to some reputable UK and US estimates, up to 5 per cent of the
  gross national product of western economies disappears in fraud. Experts say 
  that the senior managers of many companies simply do not appreciate the 
  need for tight security.

  [Stuff about the Computer Misuse Act omitted]"

                   ---- End of extract ----

Just how "sophisticated" banks' computer security is can be judged from a
conversation I had last Saturday night in the pub with an acquaintance who
manages the local branch of a chain of off-licences (liquor stores).

He had just finished entering his orders onto his PC, which communicates
remotely with the firm's main warehouse in Dartmouth (I think). He told me that
he entered the normal 5-digit code to send in his completed order, and was
amazed to find displayed on his screen the credit card transaction records
from Barclays' Bank in South Yorkshire, with full details: names, account
numbers and amounts.

Feeling thoroughly confused, he switched off the machine and went to bed.
When he checked the next day, he found that his order *had* been correctly
received.

Obviously just a one-off incident that need not affect public confidence!

Peter Mellor, Centre for Software Reliability, City University, Northampton 
Sq.,London EC1V 0HB +44(0)71-253-4399 Ext. 4162/3/1 p.mellor@uk.ac.city (JANET)

   [Also reported by  Sanford Sherizen <0003965782@mcimail.com>]

------------------------------

Date: Fri, 12 Oct 90 23:59:54 CDT
From: rdd@rascal.ics.utexas.edu (Robert Dorsett)
Subject: Equinox on A320

>>The programme went on to consider the crash of the A320 at Bangalore. A pilot
>>was interviewed saying that it was virtually unknown for an aircraft to lose
>>height in such a way in clear conditions on a landing approach.
>
>We know that the Bangalore crash _was_ pilot error. Both the `black box'
>and the cockpit voice recorder indicate that the pilots were to blame.
>Flight International has given a good account of this

On the other hand, Flight International has been extremely close to Airbus
throughout the development of the aircraft.  While I like the magazine, it is
also a proponent of Euro-oriented industry, and has been very careful not to
say anything too damaging about the airplane--and has certainly not given 
detailed consideration to the voluminous controversial issues which surround 
many aspects of the aircraft.  

At the risk of sounding like a broken record, I suggest the following:
"pilot error" is an unacceptable answer.  In clear, stable conditions, with
an (apparently) operational airplane, one just doesn't go around crashing 
airplanes.  "Pilot error" might be acceptable if, say, one reverses a holding 
pattern and flies into a mountain in clouds, but Bangalore (and Habsheim) 
smacks of a systemic error of some sort.  What could it be?  Let's see:
   - the airlines' hiring and qualification mechanisms (ab initio).
   - the training mechanism (computer-*based* training, supplied by Airbus)
   - the overall *philosophy* of the flight deck design (Airbus)
   - individual components of flight deck design (altimeter design, etc)
   - support system problems (FADEC unresponsiveness, ignoring commands
     which put the airplane out of the computed "safe" envelope)

The emphasis in RISKS has long been on the last category: concerns about
hardware and software failure, common sources of failure, and so on.

>The captain left the aircraft in idle descent mode and
>flew into the ground. The aircraft warned the pilots (both visually and
>aurally), but they ignored the warnings. Equinox chose not to report
>this (the rest of the programme seemed very convincing).

This supports the systemic view.  In my experience on an A320 simulator
(reported on sci.aeronautics about five weeks ago), I noted that there were way
too many alerts.  There are two types: warnings and cautions.  They both have
the same chime, but illuminate different lights.  They often deal with utterly
trivial situations, but require the pilot to drop what he's doing and sort
through the ECAM displays to figure out whether to spend any MORE time on it.
In many cases, the computer's already taken care of cautions, so, "why worry?"

Apart from too many alerts, ground-proximity warning systems have a poor
reputation in the airline industry as a whole: false warnings at cruising
altitude, warnings during properly-conducted approaches, etc.  These have been
with us for nearly 20 years; crew reluctance to pay attention to them has
resulted in several other airliner crashes (although it's undeniable that the
systems have also saved lives).

Lastly, there has *always* been a tendency in the airline industry to make
unworkable or poor designs work (e.g., Comet, DC-10).  Given a poorly designed
cockpit, the tendency is to attempt to train around any defects.  Ditto with
a bad airplane.  This *suppresses* the consequences of systemic error, but by
no means *eliminates* them.  When "pilot error" happens in this environment, 
all too often the operator is castigated, while the circumstances which 
produce the error are ignored.


I suggest (again) that the way the airplane interacts with the pilot is at
LEAST as important as component-wise reliability.  Just because the machine
works does not mean that the machine's *design* is satisfactory for human
operation.  This is a consideration that will become increasingly important
with all aspects of automation, and needs to be addressed in this forum.  This
has nothing to do with lawsuits, sensationalism, or PR-types.  It has to do
with saving lives, and preserving the capability of the human component of
safety-critical systems to perform its role properly.

End of diatribe.

Robert Dorsett     UUCP: ...cs.utexas.edu!rascal.ics.utexas.edu!rdd  

------------------------------

Date: Sat, 13 Oct 90 11:07:01 -0400
From: ckd@cs.bu.edu
Subject: Re: A320s and Northwest Airlines (Epstein/Spaf, RISKS-10.49)

spaf> Gene Spafford <spaf@cs.purdue.edu> said:
spaf> A few months ago, I told a friend about the various stories I
spaf> had read here and elsewhere about the A320.  The subject came up
spaf> when I explained why I would never again fly Northwest Airlines
spaf> (they bought a bunch of A320s for domestic use).

I've seen a few of them; never flown on 'em, though (hey, *I* read RISKS!).  My
policy still is to always check the aircraft type when making reservations.

My big "Northwest A320 story" is of a time I was flying from Seattle to Boston,
with a stop in Detroit.  There was a DC-10 at Detroit with a gate hold for
maintenance (hydraulics problems, I believe) and they swapped our aircraft for
that one (the flight to LAX already being two hours late, they figured they'd
spread the misery out a bit, instead).

Our flight was scheduled to leave DTW at about 9 pm; we eventually
left at around 11-11:30, arriving in Boston at about 1 in the morning.

The A320, originally scheduled to be the 7:30 flight from Detroit, was listed
on the monitors at Boston's Logan airport to be arriving at 3 am (meaning it
had not left the ground in DTW when we arrived in Boston).

The DC-10 had had its own problems, but they were (obviously) better
understood by the ground crews involved.

An issue of RISKS management: on my last flight through Detroit, I saw a brand
new ("two months old") 747-400 being loaded for a flight to Minneapolis (one of
Northwest's other hub cities).  After checking the schedule, I found that this
plane is currently being used *only* to shuttle between the two cities.

Anyone want to bet it's for ground-crew and maintenance crew
familiarization at two airports likely to see many more of the -400s?

--Chris
< Christopher Davis, BU SMG '90  <ckd@cs.bu.edu> <...!bu.edu!bu-cs!ckd> >

------------------------------

Date: 14 Oct 90 21:51:23 GMT
From: ctl8588@rigel.tamu.edu (LAUGHLIN, CHET)
Subject: Ada MultiTasking

In response to Erling Kirstainsen's article about Ada's multitasking being
vaguely defined: my Real-Time Systems class has had problems with exactly this
issue.  The class is a graduate-level course, and we had hoped to use Ada on a
network of IBM PS/2s for the labs.

The first lab involved two tasks running in parallel.  We expected the tasks
to time-slice on a single machine, but this was not the case: the compiler
would simply run the highest-priority task until it ended, and then run the
lower-priority task.  It was interesting to note that programs that ran
correctly on Suns did not run correctly on the PS/2s, even though they
compiled without change.
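
For illustration, here is a minimal sketch of the kind of two-task program
involved.  It is not our actual lab code; the task names, priorities, and
loop counts are made up, and the legal range for pragma Priority is itself
implementation-defined.  The point is that neither loop contains a delay,
entry call, or other synchronization point, so a run-time system that does
not time-slice (as on our PS/2s) is free to run High to completion before
Low ever gets the processor, while an implementation that does time-slice
may interleave the output.

   with Text_IO;

   procedure Two_Tasks is

      task High is
         pragma Priority (10);   --  illustrative value
      end High;

      task Low is
         pragma Priority (5);    --  illustrative value
      end Low;

      task body High is
      begin
         for I in 1 .. 5 loop
            Text_IO.Put_Line ("high" & Integer'Image (I));
            --  No delay or rendezvous here, so nothing obliges a
            --  non-time-slicing run-time system to reschedule.
         end loop;
      end High;

      task body Low is
      begin
         for I in 1 .. 5 loop
            Text_IO.Put_Line ("low " & Integer'Image (I));
         end loop;
      end Low;

   begin
      null;   --  the main program simply waits for both tasks to terminate
   end Two_Tasks;

A common workaround is to insert explicit delay statements as scheduling
points, but the language does not require an implementation to switch tasks
even then; that is exactly the interpretive latitude complained about below.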

Now, one could blame the operating system: DOS is nowhere close to being a
multitasking system.  Or one could blame the language.  The compiler's
documentation makes no mention of the fact that tasks will not run
concurrently, so I'd lean toward placing the blame there.  I suppose that if
Unix or OS/2 could be afforded and installed on the PCs, the programs would
compile and work correctly.  We have also discussed in class how the Ada
specification leaves the scheduling of tasks open to interpretation.

The end result is that the labs will be done in C on the PS/2s.

Chet Laughlin                  CTL8588@RIGEL.TAMU.EDU         

------------------------------

Date: Mon, 8 Oct 90 13:21:46 edt
From: davis@ai.mit.edu (Randall Davis)
Subject: Re: Expert system in the loop 

Two (last?) gasps:

1) As the previous discussion (two years ago) of this incident made clear,
another fundamental problem here is the tactical advantage of offense over
defense: the distance from which it's possible to shoot accurately is larger
than the distance at which it's possible to identify the source.  That may not
have been a crucial factor in this incident, but it contributes to the mindset
and practice that says self-defense means you may have to fire at a threat
before you're certain of its identity.  That's a consequence of all sorts of
technology, and it happens to the infantryman with a rifle because bullets can
fly further than we can easily see.


As for the title of this whole discussion -- "Expert systems in the loop":

2) There aren't any and there never were any.  As abundant discussion has made
clear (particularly the description by Matt Jaffe in 10.46), the Vincennes had
some interesting signal processing and data description hardware and software,
but nothing that by any stretch deserves the term "expert system."  If
there's more software to the story than anyone has described thus far, it would
be interesting to hear about it from a knowledgeable source.  We might also
consider this, from an early report about the system (from a story in Risks
9.70):

  "The anti-air warfare officer made no attempt to confirm the reports
  [from the crew] on his own," the commander-in-chief of the US Central
  Command reported.  "Quick reference to the console directly in front of
  him would have immediately shown increasing, not decreasing, altitude 
  [of the Iranian jet]."  Instead, this "experienced and highly qualified
  officer, despite all of his training, relied on the judgment of one or
  two second-class petty officers, buttressed by his own preconceived
  perception of the threat, and made an erroneous assessment to his
  commanding officer."

Note in particular the second sentence, indicating that the system displays
data about the aircraft, not threat interpretation.  As noted in earlier
discussions, this data (like the data on your speedometer) can of course be
incorrect, but that's a different issue.

So until otherwise informed, let's be clear about this: it was a problem of
"Instruments in the loop".  That by itself may be worth discussing, but it is
not and never was an expert system.  And it might be interesting to ask, Why
the rush to label it an expert system?

------------------------------

Date: Mon, 15 Oct 90 09:37:10 PDT
From: kalmin@atd.dec.com
Subject: Announcement of CPSR annual meeting

                             1990 Annual Meeting
                                     of
                Computer Professionals for Social Responsibility
                              October 20 and 21
                    Stanford University and Palo Alto, CA

Computer Professionals for Social Responsibility, the nation's only public
interest organization of computing professionals, will hold its 1990 Annual
Meeting at Stanford University and at Ming's Villa restaurant in Palo Alto on
October 20 and 21, 1990.

The CPSR Annual Meeting is a national meeting that gives computer
professionals from all over the country a chance to meet and discuss some of
the most important and interesting issues facing the profession and the
public.  This year's meeting will cover civil liberties and First Amendment
rights in computer communication; using computers to support democratic
oversight of government; women in the computing field; and what the public
learns about computers from the media.

The Saturday program, to be held at Stanford University, will include the
following:

        John Perry Barlow -- "Civilizing Cyberspace:  Computers, Civil
        Liberties and Freedom."  Barlow is the co-founder of the Elec-
        tronic Frontier Foundation, a lyricist with the Grateful Dead,
        and author of the article "Crime and Puzzlement" featured in
        the latest issue of The Whole Earth Review.

        David Burnham -- "Turning the Tables:  Computer Oversight for
        Citizens."  Burnham is a former investigative reporter for the
        New York Times, and the author of the books The Rise of the
        Computer State and A Law Unto Itself, the latter an expose of
        the IRS.  While at the Times, Burnham was responsible for the
        stories that led to the Knapp Commission on police corruption
        in New York City, and he was the reporter who broke the Karen
        Silkwood story.  He now works with the Transactional Records Access
        Clearinghouse (TRAC) at Syracuse University.  TRAC uses the Freedom of
        Information Act and computer analysis to provide oversight of
        powerful Federal agencies such as the IRS, the Nuclear Regula-
        tory Commission, and the Department of Justice.

There will be two panel discussions the afternoon of Saturday, October 20:

        "Women in Computing: Where We Are, Where We Want To Be, and
        How To Get There."

        Panelists:
                Shari Lawrence Pfleeger, Chair, ACM Committee on
                the Status of Women and Minorities

                Donna Lehnoff, Women's Legal Defense Fund

                Sheila Humphreys, Department of Electrical Engi-
                neering and Computer Science, UC Berkeley

                Barbara Simons, Secretary, Association for Comp-
                uting Machinery (ACM)

        Panel moderated by Anita Borg, DEC Western Research Lab

        "The Media and 'Mythinformation':  What and How Does the Public 
        Learn About Computers?"

        Panelists:

                Bob Abel, multi-media expert and television 
                commercial producer, Synapse Technologies

                Michael Rogers, general editor and technology
                editor, Newsweek magazine

                Rudy Rucker, physicist and science fiction 
                author

                Rob Swigert, professor of creative writing, San
                Jose State, science fiction author, and author
                of Portal, interactive fiction

        Panel moderated by Paul Saffo, Institute for the Future

The Saturday program begins at 9 a.m., and a continental breakfast will be
served just prior to the meeting.  There will be a lunch break from noon to 2
p.m., and the meeting is scheduled to end at 5:30.

The Sunday, October 21, portion of the two-day meeting will be dedicated to
discussions about CPSR as an organization, and there will be workshops on
computers and education, the environment, civil liberties and privacy, peace
and war, and computers in the workplace.

Admission to the CPSR Annual Meeting is $35 for members, $45 for non-members
until October 14.  After October 14 prices go up $10 for each category.
Non-members can join CPSR for one year for $40 and pay the member price to
the meeting.  Admission to the banquet is $50 per person, the same price for
members and non-members.

In addition, for $100 more, people can attend a fundraising reception for CPSR
at the offices of Regis McKenna, Inc., on Saturday evening from 6 to 8 p.m.
This is a chance to meet the speakers, leaders of CPSR, and many people from
the computer industry of Silicon Valley.  Contributions to CPSR are
tax-deductible.

For more information and registration materials, contact CPSR at (415) 322-3778
or by electronic mail at cpsr-staff@csli.stanford.edu.

------------------------------

End of RISKS-FORUM Digest 10.50
************************