[comp.risks] RISKS DIGEST 5.33

RISKS@CSL.SRI.COM (RISKS FORUM, Peter G. Neumann -- Coordinator) (09/05/87)

RISKS-LIST: RISKS-FORUM Digest  Friday, 4 September 1987  Volume 5 : Issue 33

        FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS 
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  How to Beat the Spanish telephone system (Lindsay F. Marshall)
  Re: Automated control stability and sabotage (Amos Shapir)
  Crisis in the Service Bay (Mark Brader)
  Who is responsible for safety? (Nancy Leveson)
  Certification of Software Engineers 
    (Brian Tompsett, Richard Neitzel, Wilson H. Bent)
  Irish Tax Swindle (John Murray)
  Pogo Wins a Free Lunch -- Costs and Liability in Good Systems (Hal Guthery)
  Re: Bank Computers and flagging (Bill Fisher)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in good
taste, objective, coherent, concise, nonrepetitious.  Diversity is welcome. 
Contributions to RISKS@CSL.SRI.COM, Requests to RISKS-Request@CSL.SRI.COM.
FTP back issues Vol i Issue j from F4.CSL.SRI.COM:<RISKS>RISKS-i.j.  
Volume summaries for each i in max j: (i,j) = (1,46),(2,57),(3,92),(4,97).

----------------------------------------------------------------------

From: "Lindsay F. Marshall" <lindsay%kelpie.newcastle.ac.uk@Cs.Ucl.AC.UK>
To: risks@csl.sri.com
Date: Tue, 1 Sep 87 11:35:00 BST
Subject: How to Beat the Spanish telephone system
(Really-From Rosa Michaelson, courtesy of LFM)

SPANIARDS LEARN THE PRICE (ART) OF PHONE PIRACY (From the Independent)

Madrid -- The ears of officials at the Spanish telephone company Telefonica
are burning over an article printed in a Dutch newspaper that ran
instructions on how anyone with a computer hooked up to a telephone can
dial around the world at Telefonica's expense.

  Such telephone piracy has plagued other countries, such as the United
States, Britain and Germany, which were forced to take steps to end this
electronic joy-riding.

  But according to the Dutch newspaper Volkskrant, Spain, Portugal and
Italy all lack the means of tracking down pirates who ride the 'bridges'
-- those who ring anywhere in the world and talk as long as they want,
courtesy of the intermediate or 'bridge' method.

  The method is like an electronic shell game, whereby the pirate calls a
busy number in Spain through his local operator.  Then, using that open line
to Spain, the pirate coaxes (!@??) the right series of sonic signals from
his computer to ring elsewhere at the telephone company's cost.
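
  The underlying weakness is in-band signalling: control tones travel on
the same audio path as the voice, so any computer with an audio output can
speak the exchange's control language.  Below is a minimal sketch in
Python, using the 2600-Hz supervisory tone of the old analogue trunk
networks purely for illustration -- the article does not say which tones
the Spanish 'bridges' respond to.

    import math, struct, wave

    RATE = 8000      # telephone-grade sample rate, samples/second
    FREQ = 2600.0    # old analogue-trunk supervisory tone (illustrative)
    SECS = 1.0

    # One second of pure tone, written to a WAV file.  Played into an
    # open line, such a tone is indistinguishable from the network's
    # own signalling -- which is the whole problem.
    with wave.open("tone.wav", "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)                  # 16-bit samples
        w.setframerate(RATE)
        for n in range(int(RATE * SECS)):
            s = int(32000 * math.sin(2 * math.pi * FREQ * n / RATE))
            w.writeframes(struct.pack("<h", s))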

  A Telefonica spokesman said that the number of callers hitching free
rides on the Spanish system is 'infinitesimal'.  But he said most
European telephone companies had a gentleman's (?sexist) agreement to
split the costs incurred by the pirates. 
                                                     rosa

------------------------------

To: nsc!comp-risks@Sun.COM
From: nsc!nsta!nsta.UUCP!amos@Sun.COM (Amos Shapir)
Newsgroups: comp.risks
Subject: Re: Automated control stability and sabotage
Date: 22 Aug 87 19:33:35 GMT

      [... about affecting the values of shares by sabotaging computers]

Actually, something like this was attempted here lately -- someone called
a broker, posed as a customer, and placed a huge purchase order for a
certain share; luckily the purchasing bank's computer flagged the order
as suspicious and it was never carried out.  This time the fraud was
attempted by a human, and the computer caught it.  I guess Wall Street
computers may be protected much better.
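
The report doesn't say what rule the bank's computer applied.  A plausible
guess is a simple size-versus-history check, sketched here in Python; the
function name and the factor of ten are invented for illustration.

    def looks_suspicious(order_value, recent_values, multiple=10):
        # Flag any order that dwarfs the account's recent activity;
        # with no history at all, refer the order to a human.
        if not recent_values:
            return True
        baseline = sum(recent_values) / len(recent_values)
        return order_value > multiple * baseline

    # A 500,000-share order against a history of ~1,000-share orders:
    print(looks_suspicious(500000, [800, 1200, 1000]))   # True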

Amos Shapir, National Semiconductor (Israel)
6 Maskit st. P.O.B. 3007, Herzlia 46104, Israel  Tel. +972 52 522261
amos%nsta@nsc.com (soon to be amos%taux01@nsc.com) 34 48 E / 32 10 N

------------------------------

Date: Tue, 1 Sep 87 12:33:43 EDT
From: msb@sq.com (Mark Brader)
To: risks@csl.sri.com
Subject: Crisis in the Service Bay (condensed from Toronto Star)

The evolution of the species -- from grease monkey to service technician
-- is behind schedule.  Our education, apprenticeship, and retraining
system is not keeping pace with the spread of electronics and new high-tech
components through cars, say service experts.  They point to a crisis ...

Improvements in quality and defect avoidance have spared most new car
owners the worst effects of a repair industry that's technically dated
and starved for able new recruits.  But watch out when today's cars and
those of tomorrow start wearing out.

Already some consumers are enduring multiple visits to dealerships to cure
such aggravations as stalling, quitting, rough idling, hesitation, and
odd noises, not to mention more serious problems. ...

Each year, literally millions of dollars' worth of computer controls are
being removed from cars unnecessarily when mechanics can't trace the true
source of problems. ...

Today's average mechanic in Canada is in his [sic] mid-30's.  He completed
his training more than a decade ago, before computer controlled emission
and engine management systems, anti-lock brakes, or widespread use of
turbochargers.  Add to that the coming electronically controlled four-wheel
driving and four-wheel steering systems.

"There is nothing in the auto mechanic program that says you have to go back
to school for upgrading", says [Ford of Canada's William] Rowley.  "Most
fellows have been out of school so long, they don't know how to learn any
longer."

Rowley says there's also a problem with the compensation system for mechanics.
A fellow can often earn more doing routine service tasks, like brake repairs,
than he can earn puzzling over a mysterious electronic problem. ...

Rowley believes every new-car dealership should have one electronics
specialist by 1990 and half of new-car mechanics should have these skills
by 1995. ...

James Lanthier, ... [of] Ontario's Ministry of Labour, says 598 Ontario
mechanics are now spending weekends completing three trade updating courses.
None has yet finished more than the first course, which began in January,
on the fundamentals of computerized vehicle management systems.  And these
students are only a minority of the 15,000 mechanics in Ontario involved
directly in repairing cars. ...

Ford's Rowley complains ... that community [i.e. vocational] colleges
are still graduating apprentice mechanics who have not been exposed to
some of the latest technical features of cars.  He says most college
curriculums are three years out of date. ...

Only recently has the auto industry persuaded the federal government to
take a leadership role in trying to help provincial education authorities
set a course of action. ... If any provinces took immediate action ...
the first of a new crop of high-tech auto technicians would be collecting
their licenses some time in late 1993.  And even that may be too optimistic.

Condensed by Mark Brader from a column by James Daw that appeared in the
Toronto Star on August 29, 1987.

------------------------------

To: risks%csl.sri.com@ROME.UCI.EDU
Subject: Who is responsible for safety?
Reply-To: nancy@ics.UCI.EDU
Date: Fri, 21 Aug 87 13:27:29 -0700
From: Nancy Leveson <nancy%murphy.uci.edu@ROME.UCI.EDU>

>From Frank Houston (Risks 5.31):
>The point that I am striving for is that assigning to SOMEBODY the
>responsibility for safety, quality or security, ...

While it is obvious that safety or security or any other important software
quality must be the concern of everybody or it will not be achieved, that
does not mean that assigning responsibility is unnecessary, or that
everybody (en masse) can perform the activities needed to achieve it.
One of the main reasons for the growth of the system safety engineering
field is that the early missile systems were so unsafe.  The basic procedure
for achieving safety in these early systems was to assign responsibility for
safety to everybody.  Unfortunately, there are always conflicting goals in
any engineering design process.  These goals need to be prioritized and
decisions made with respect to these priorities and with knowledge about how
the decisions will affect the ultimate quality goals of the entire system
under construction.  The responsibility for setting goals and priorities
rests with management.  But to then assign responsibility to every engineer
and designer to make all decisions about conflicting goals means that each
person must understand all the implications of every decision they make on
every part of the larger system under construction.  This is unrealistic
because for any reasonably complex system, each person cannot possibly have
all the information necessary to make these decisions.

The individual engineer is responsible for implementing the safety
requirements for the component or subsystem under his control.  But there
needs to be a systems safety engineer who is responsible for deriving those
individual subsystem safety requirements from the SYSTEM safety requirements,
ensuring that the design personnel are aware of them (so that they can
implement them), providing the interface point for questions that arise as
the design progresses (requirements on any real project are not completely
specified before design begins and never changed thereafter -- for one thing,
the design process itself suggests additional requirements), etc.

My personal belief after studying complex and large projects is that there
is a necessity for a software safety engineer to interface with the system
safety engineer.  The reason for the extra level of interface is the
complexity of the software subsystem and the practical problems of training
a person to be an expert in all aspects of engineering and computer science.
The software safety engineer is responsible for performing the duties listed
above for the system safety engineer but with respect to the software
subsystem and for interfacing with the system safety engineering group which
is dealing with the larger questions of the interfacing between the
subsystems.

This does NOT imply that each person is not responsible for the quality of
the subsystem on which they are working, only that they cannot possibly be
responsible for understanding all aspects of the entire system and that
decisions about such things as safety and security cannot be made
individually and in a vacuum by hundreds of people without any central
coordination or planning function.

So yes, safety must be designed into the system by each individual on the
project.  But somebody must provide them with the information necessary to
do that.  And that person (or group) has the responsibility for the
correctness of that information and for getting that information to the
people who need it in a form and at a level of detail that can be most 
effectively used by those people.
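
One way to picture that coordination role: the system safety engineer owns
the allocation of SYSTEM safety requirements to subsystems, and the
allocation can be checked mechanically.  A sketch in Python, with all the
requirement names invented for illustration:

    # Every SYSTEM-level safety requirement must trace to at least one
    # subsystem requirement that a named engineer owns; the gaps are
    # exactly the decisions nobody is positioned to make alone.
    allocation = {
        "SYS-1: no arming while on the ground":      ["SW-7", "HW-3"],
        "SYS-2: engine shutdown on guidance loss":   ["SW-12"],
        "SYS-3: operator override always available": [],
    }

    unallocated = [req for req, subs in allocation.items() if not subs]
    if unallocated:
        print("safety requirements with no owner:", unallocated)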

Nancy Leveson, University of California, Irvine

------------------------------

Date: 24 Aug 87 11:38:21 BST
From: Brian Tompsett <mcvax!ecsvax.ed.ac.uk!BCT@seismo.CSS.GOV>
Subject: Certification of Software Engineers
To: "risks%csl.sri.com@ukc" <csl.sri!risks@ukc.ac.uk>

  Just a brief contribution to the debate on the certification of software
engineers.  In the UK, software engineers may apply to be recognised as a
"Chartered Engineer" by the Engineering Council.  This gives the software
engineer the same status and rights as a chartered civil or nuclear engineer.

  The applicant's qualifications and experience are examined by an admitting
body to ensure that he meets the required standard.  The British Computer
Society is in the process of nominating members for Chartered Engineer
status.  Membership of the BCS requires sufficient academic qualifications
or relevant commercial experience, and the member has to adhere to a code
of professional conduct.

  In the UK, therefore, there is a growing body of certified and experienced
computer professionals, recognised by their peers, who are qualified to
undertake work to the standards required.
                                               Brian Tompsett, MBCS.

Brian Tompsett. Department of Computer Science, University of Edinburgh,
JCMB, The King's Buildings, Mayfield Road, EDINBURGH, EH9 3JZ, Scotland, U.K.
Telephone:         +44 31 667 1081 x3332.
JANET:  bct@uk.ac.ed.ecsvax  ARPA: bct%ecsvax.ed.ac.uk@cs.ucl.ac.uk
USENET: bct@ecsvax.ed.ac.uk  UUCP: ...!seismo!mcvax!ukc!ecsvax.ed.ac.uk!bct
BITNET: psuvax1!ecsvax.ed.ac.uk!bct or bct%ecsvax.ed.ac.uk@earn.rl.ac.uk

------------------------------

Date: Wed, 26 Aug 87 19:33:54 mdt
From: udenva!rneitzel@seismo.CSS.GOV (Richard Neitzel)
To: RISKS@CSL.SRI.COM
Subject: Certification (Re: RISKS 5.28)
Organization: U of Denver

A Firm NO to certification. Richard Neitzel, Rockwell International, Golden, CO

In her recent article Nancy Leveson states:
  >                 ...Am I wrong in my observation that under-qualified 
  >people are sometimes doing critical jobs and making important decisions?  

	No, you are not wrong in your observation; many under-qualified
people are working out there.  However, I seriously doubt that any system
of prescribed training, education, experience, etc., will be either
effective or of use to much of the computer industry.  I base this on three
observations.

	First, there is the implicit assumption that engineers are
particularly intelligent and able persons.  Having worked with engineers
from all sorts of backgrounds, and being one myself, I honestly feel that
the number of under-qualified, and in some cases outright ignorant,
engineers is staggering.  This applies equally to persons with years of
"experience" and to recent graduates.  I have worked with "recognized
experts" who obviously knew little more about their subject than I did.

	Second, and closely related to the first item, is the assumption
that by using tests, experience criteria, etc., we can determine who the
qualified persons are.  As a former certified nondestructive testing Level
II, I can testify from personal experience that such programs have severe
problems.  For those not familiar with it, nondestructive testing is a
highly specialized field that involves the inspection of items for defects
using such methods as ultrasonics, radiography, and eddy currents.  Since
one missed defect could cause a disaster, such as an airline crash, the
need for qualified persons is obvious.  Under the agency of the American
Society for Nondestructive Testing, a program of training, testing, and
certification has been in use for over 20 years.  Unfortunately, it has not
proved adequate to the task.  First, given human nature, fraud has been a
part of the system.  Requiring certification cuts the number of available
people sharply, so the incentive for both employer and employee to falsify
certifications is great.  Second, the program has proved to have only a
small bearing on the quality of the persons who are certified.  Recent
studies have shown that only 60-70 percent of certified persons are
performing at an adequate level, and of those, most are performing their
jobs at only about the 70-percent level.

	Third, who will determine what the standards are to be?  Is it truly
possible for a committee to determine what is suitable for a broad spectrum
of industries?  Drawing again on my NDT experience, it has been discovered
that persons who are excellent at testing nuclear power plants have no luck
in aircraft inspection, and vice versa.  Blanket certification fails
because it must be too general in order to cover the entire field.  The
result has been that industries are requiring additional testing, etc., for
their own areas.  Once this starts to occur, any attempt to quantify
someone's ability in an area outside the one in which they are currently
working becomes impossible.  Do we really want to limit job mobility this
way?

	For these reasons I am against attempting to overly regulate who may
call himself a "software engineer".  I should like to close with two parting 
thoughts:

	1> Some of the most creative and effective programmers I have ever
worked with had no formal CS training (one dropped out of college after 2
years and the other was a philosophy major).  Conversely, one of the worst
was a bona fide CS graduate with 2 years' experience, whose code was simply
a wooden copy of the "proper" way to program.

	2> Medical doctors, who frequently make life-or-death decisions, are
required to pass their board exams only once (the same is true of lawyers).
Thereafter, they are judged only by the quality of their work.  Isn't that
exactly what we are doing now?

	The opinions expressed are mine alone and do not reflect any position
on this issue by Rockwell.

------------------------------

Date: Wed, 2 Sep 87 15:03:17 EDT
From: allegra!vax135!whb@EDDIE.MIT.EDU (Wilson H. Bent)
To: RISKS@csl.sri.com
Subject: Re: Regarding the certification of software engineers

>From:  <benson%cs1.wsu.edu@RELAY.CS.NET> (David B. Benson)
> I conclude that it will take at least one major accident with significant
> loss of life, this accident attributable to software failure, before there
> will be sufficient interest to establish a certification program.

But this is the trouble with software certification, with Star Wars
testability, with software copyright protection, with Look-and-Feel
lawsuits, etc., etc.: the slipperiness of software.

For each of the cases presented in RISKS recently (ATMs and other bank
errors, air and rail traffic control, power station control), I can see a
long and drawn-out legal battle over accountability and responsibility.
Who, for example, would be held responsible for improperly programming the
flap controllers and monitors implicated in the recent Detroit air crash?
Assuming that one person or group were found responsible, would they be
personally liable, or the company?  Consider the Morton-Thiokol hearings: a
fairly clear case of a bad product, and a lot of finger-pointing.  Where did
the blame finally rest?

My point is: Yes, if there were such an accident which was clearly the
result of software failure, it *might* lead to a certification
program.  Unfortunately, I don't envision such an accident, or rather
such an assignment of guilt.

Wilson H. Bent, Jr.		... ihnp4!hoh-2!whb
AT&T - Bell Laboratories	(201) 949-1277
Disclaimer: My company has not authorized me to issue a disclaimer.

------------------------------

Date: Fri 28 Aug 87 19:04:14-PDT
From: J.JXM@HAMLET.STANFORD.EDU  <John Murray>
Subject: Irish Tax Swindle

The Aug 14 1987 issue of "The Phoenix", an Irish investigative magazine,
tells of a tax swindle currently being investigated there. The taxation
authorities are the Revenue Commissioners (abbrev. Rev Comm) and the
Collector General (abbrev. Coll Gen). Checks paying tax bills and using the
abbreviations as payee were misappropriated by crooked tax officials, who
altered the payees to "Trevor Commerford" or "Collette Gerald". Accounts in
these bogus names were opened all over Dublin.  Flaws in the offices'
(manual) procedures allowed the crooked officials to effectively 'lose' the
defaulter's files in the bureaucratic system.  The defaulter is typically
happy never to hear from the tax office again, even though s/he may only
have paid a portion of the bill.
 
  "A crucial factor in the whole fiddle is the modern banking 
  practice of retaining checks and noting only the amount debited
  on customers' accounts. In days gone by, alterations to the 
  payee line would have been immediately obvious to the payer 
  when the check made its way back with the routine statement." 
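
Part of what made the alteration easy is that at least one of the bogus
names can be produced from the abbreviation almost entirely by inserting
characters, never crossing anything out.  That property is a simple
subsequence test; a sketch in Python (the function is ours, not from the
article):

    def extends_by_insertion(short, long_form):
        # True if long_form can be made from short purely by inserting
        # characters -- i.e., no stroke of the original payee need be
        # crossed out, only added to.
        it = iter(long_form.lower())
        return all(ch in it for ch in short.lower())

    print(extends_by_insertion("Rev Comm", "Trevor Commerford"))  # True
    # "Coll Gen" -> "Collette Gerald" also needs the final 'n' reworked,
    # but every other character of the original survives intact.
    print(extends_by_insertion("Coll Gen", "Collette Gerald"))    # False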

While writing this, I've thought of two other stories related to Irish 
banks. One involved using dud checks to earn interest during a prolonged
bank officers' strike. The other concerned a microcode fix I applied to 
the Bank of Ireland system which stopped it dead overnight, with an 
apparent "loss" of several million pounds. 
                                                  John Murray 

------------------------------

Date: Thu, 27 Aug 87 08:20 EDT
From: "guthery%asc@sdr.slb.com" <GUTHERY%ASC%sdr.slb.com@RELAY.CS.NET>
Subject: Pogo Wins a Free Lunch -- Costs and Liability in Good Systems
To: risks@csl.sri.com

Does building a high-quality, safe, reliable, and secure instance of
a system cost more than building a low-quality, unsafe, unreliable, and
insecure instance of the same system?  If not, then the safe one, being
of equal cost, will surely outsell the unsafe one, and the folks who
build the safe one will prosper while those who build the unsafe one will
not.  Notwithstanding catchy phrases, generally speaking, quality, safety,
reliability, and security are not free.  Who then absorbs their cost?

Frank Houston proposes that the builders themselves absorb the additional
cost when he suggests that "quality, safety and security are everybody's
jobs."  As the Japanese and Korean experiences have taught us, this works.
The Japanese auto worker does more work for a given amount of money than
an American worker.  Japanese cars cost the same as American cars, yet you
get more for your money with Japanese cars, so the builders of Japanese
cars prosper and the builders of American cars do not.

Will this approach work with software and computer systems?  Enterprise A
pays 10 engineers $50,000/year each and builds Product A.  Enterprise B
pays 15 engineers $33,333/year each and builds Product B, which costs the
same as Product A but is of higher quality, safer, more reliable, and more
secure.  My expectation is that there will be a net flow of engineers from
Enterprise B to Enterprise A.  Why?

I don't know, but I can think of a possibility.  The share of the cost of
quality, safety, reliability, and security that computer system engineers
are willing to absorb is not, in their own perception, large enough to make
a difference in the customer's buying behavior.  Thus Product A will sell
just as well as Product B, but A's engineers will do much better than B's
engineers.  So I don't think the Japanese approach will work with computer
systems.
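
For the record, the two payrolls in the example are equal to within the
author's rounding; that parity is what makes the comparison fair.  A
two-line check in Python:

    a = 10 * 50000    # Enterprise A's payroll, $/year
    b = 15 * 33333    # Enterprise B's payroll, $/year
    print(a, b)       # 500000 499995 -- the same bill, spent differently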

The only alternative is that the system's customer absorbs the additional
cost.  What are computer system buyers willing to pay for quality, safety,
reliability, and security?  The marketplace evidence says to me that,
without any downside risk for their absence, damn little.  Thus, as long as
we are willing to accept the excuse that the computer malfunctioned and no
one is to blame, we cannot expect to be asked to build high-quality, safe,
reliable, and secure systems.

MORAL: If you want to build good systems, then you either have to be willing
to absorb more of the cost of doing so or be willing to accept liability
for not doing so.  If you are willing to do neither, then you should expect
to build only low-quality, unsafe, unreliable, and insecure systems ... and
that's just what we're doing.  "We have met the enemy and he is us." -- Pogo

------------------------------

Date: 21 Aug 87 16:31:42 PDT (Friday)
From: bfisher.ES@Xerox.COM
Subject: Re: Bank Computers and flagging
To: RISKS FORUM (Peter G. Neumann -- Coordinator) <RISKS@csl.sri.com>

Joe Herman's account was most interesting, especially regarding the ATM not
reflecting the true balance.  I withdrew some bucks via an ATM on a Friday,
and a few days later -- Tuesday -- had occasion to withdraw some more.  The
receipt after Tuesday's withdrawal showed a balance significantly higher
than the amount prior to Friday's transaction.  I double-checked and found
no EFT or normal deposits outstanding.  I called the hot line and was told
that I probably had transacted during a "float adjustment" period and that
the amount indicated on Tuesday's receipt was not actually there.  I was
then informed that if I had transacted a few minutes later, the receipt
probably would have indicated the "real" amount.  A couple of other minor
incidents similar to this have convinced me that I must be going right down
the rabbit hole every time I enter a bank parking lot.  I don't remember
signing an agreement that they could fiddle with my account to adjust
anything.
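
One guess at the bookkeeping behind that receipt: the ATM reports an
"available" figure with a transient float adjustment folded in, while the
real ledger catches up later.  A sketch in Python -- every number and name
here is invented:

    ledger = 400.00        # what the account really holds
    pending = [-60.00]     # Friday's withdrawal, not yet posted
    float_adj = 250.00     # transient adjustment, not real money

    available = ledger + float_adj    # what Tuesday's receipt showed
    actual = ledger + sum(pending)    # what the customer can rely on

    print("receipt: %.2f   reality: %.2f" % (available, actual))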

Bill Fisher

------------------------------

End of RISKS-FORUM Digest
************************