risks@CSL.SRI.COM (RISKS Forum) (12/19/90)
RISKS-LIST: RISKS-FORUM Digest  Tuesday 18 December 1990  Volume 10 : Issue 69

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

Contents:
  "Computer Models Leave U.S. Leaders Sure of Victory" (Jon Jacky)
  Re: Airline safety (Jim Rees)
  The Incredible Lightness of Reference (Jerry Leichter)
  Unexpected effects of PC's (Jerry Leichter)
  Long-distance printing, or the risks of being well-known (Jerry Leichter)
  Organizational Aspects of Safety (Lance J. Hoffman)
  Covert communication through public databases (Larry Hunter)

The RISKS Forum is moderated.  Contributions should be relevant, sound, in
good taste, objective, coherent, concise, and nonrepetitious.  Diversity is
welcome.  CONTRIBUTIONS to RISKS@CSL.SRI.COM, with relevant, substantive
"Subject:" line.  Others ignored!  REQUESTS to RISKS-Request@CSL.SRI.COM.
FTP VOL i ISSUE j: ftp CRVAX.sri.com<CR>login anonymous<CR>AnyNonNullPW<CR>
CD RISKS:<CR>GET RISKS-i.j<CR>; j is TWO digits.  Vol summaries in risks-i.00
(j=0); "dir risks-*.*<CR>" gives directory; bye logs out.
ALL CONTRIBUTIONS CONSIDERED AS PERSONAL COMMENTS; USUAL DISCLAIMERS APPLY.
Relevant contributions may appear in the RISKS section of regular issues of
ACM SIGSOFT's SOFTWARE ENGINEERING NOTES, unless you state otherwise.

----------------------------------------------------------------------

Date: Tue, 18 Dec 1990 10:09:53 PST
From: JON@GAFFER.RAD.WASHINGTON.EDU (Jon Jacky)
Subject: "Computer Models Leave U.S. Leaders Sure of Victory"

That's the headline for this story from THE SEATTLE TIMES, Dec. 17, 1990,
p. A3:

COMPUTER MODELS LEAVE U.S. LEADERS SURE OF VICTORY
by Robert C. Toth, Los Angeles Times

... Computer models of ground warfare convince the administration it can
deliver on its promise of an overwhelming victory.  An Army assessment of US
and Soviet-made Iraqi equipment --- from tanks to rifles --- shows that the
United States has an edge in quality to compensate for its numerical
disadvantage. ...

When such assessments are factored into opposing ground-combat units and the
forces are pitted against each other in war games, the conclusion by Pentagon
and many non-government experts seems to be the same: "We'd crush them," said
Joshua Epstein of the Brookings Institution.  Iraqi numbers, including its
million-man army, should not be a problem, added Barry Posen of the
Massachusetts Institute of Technology.  "If anything, we might begin to
address the ethical question of how much slaughter you want to inflict on his
forces if war comes." ...

Computer modelling for a ground war is based on assessments by the U.S. Army
War Gaming Agency of the combat value and combat effectiveness of 10 types of
weapons. ...  Effectiveness ratings were determined for (... tanks and ...)
other weapons categories from artillery to small arms.  A rifle is valued at
1.0, a machine gun at 1.77, a 155 mm howitzer at 1.02, and an MLRS rocket
system at 1.16.  (... etc. ...)

The combat value, or relative weight, of the weapon categories was decided by
a team of experienced battlefield commanders.  Most valuable, they decided,
are attack helicopters, followed by artillery and tanks, scaling down to
small arms.  When the number of weapons in a U.S. armored division is
multiplied by those effectiveness ratings and their combat value, the
resulting total --- about 130,000 --- becomes the Armored Division
Equivalent, set at 1.  All other units can be measured against that standard.
A U.S. infantry division, for example, is given an ADE of 0.5, or half an
armored division.

Epstein and his associate, Alf Hutter, calculate that all U.S.-led forces in
Saudi Arabia will be valued at 17.6 ADE's by February, when the buildup is
completed.  They calculate that Iraqi forces in Kuwait will be valued at 7.4
ADE's, and those in northern Kuwait and southern Iraq at 9.6 ADE's.
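  [Editorial aside, not part of the Times story: below is a minimal sketch,
  in Python, of the ADE arithmetic as described above.  The weapon counts,
  the tank rating, and all of the "combat value" weights are invented for
  the illustration; only the method, the four published effectiveness
  ratings (rifle 1.0, machine gun 1.77, 155 mm howitzer 1.02, MLRS 1.16),
  and the roughly 130,000-point armored-division standard come from the
  article.

      # Hypothetical ADE calculation, following the method described in the
      # article: count * effectiveness rating * combat value, summed over
      # weapon categories, then scaled by the armored-division standard.
      # Counts and most weights below are made up for illustration only.

      UNIT = {
          # category:        (count, effectiveness, combat value)
          "rifle":           (15000, 1.00,  1.0),   # count/value invented
          "machine_gun":     ( 1200, 1.77,  2.0),   # count/value invented
          "155mm_howitzer":  (  150, 1.02, 55.0),   # count/value invented
          "mlrs":            (   27, 1.16, 55.0),   # count/value invented
          "tank":            (  350, 1.50, 40.0),   # rating/value invented
      }

      ADE_STANDARD = 130000.0   # score of one U.S. armored division

      def ade(unit):
          """Weighted score of a unit, in Armored Division Equivalents."""
          score = sum(n * eff * val for n, eff, val in unit.values())
          return score / ADE_STANDARD

      print(f"ADE of this hypothetical unit: {ade(UNIT):.2f}")   # ~0.39

  Note that the arithmetic itself is trivial; the judgment is all in the
  ratings and weights fed to it.]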
That appears to place opposing forces in balance, but the Iraqi forces are
widely dispersed in defense, offering a challenge of only 2.3 ADE's for U.S.
forces in the "main attack sector."

Epstein's bottom line, based on his modeling, is an 18-day war.  The first
six days would be used for air strikes to establish control and soften up
Iraqi ground forces.  In the next six days, the ground attack, breakthrough,
and movement northward would take place.  The final six days would be used
to mop up.  Casualties would be about 15,000 (with 25 percent dead), he said,
although they could range from 3,600 to 22,000.

Posen said such models understate the U.S. advantage because they do not
reflect the better training, logistic supplies, command and control, and
other qualitative edges.  "We will have total control of the air," he said.
...

U.S. forces will be able to concentrate at their intended attack point to
reach a jump-off advantage of 5-1, Posen said, because of the U.S.
expectation that Iraqi artillery --- the usual weapon to prevent such
concentrations behind enemy lines --- will be largely wiped out.  "We are
very good at counter-battery fire," he said, pointing to special radar to
locate any Iraqi artillery batteries that could then be assaulted with
massive, rapid-fire artillery weapons and the Multiple-Launch Rocket Systems.
One MLRS volley of 12 rockets is supposed to have the same effect as 72
rounds of 155 mm howitzers.  "In just artillery alone, we figured we could
deliver 500 tons of metal (artillery shells) on his positions in only one
hour," Posen said.  "We astonished ourselves with that figure."

[ And so on.  There were a few caveats: "There are doubters ... skeptics
remain unpersuaded ... experts ... warn that unpredictable events could turn
the most modern technological projections into catastrophe ..."  - JJ ]

- Jon Jacky, University of Washington, Seattle  jon@gaffer.rad.washington.edu

------------------------------

Date: 11 Dec 1990 14:35 -0500 (Tuesday)
From: rees@citi.umich.edu (Jim Rees)
Subject: Re: Airline safety

In RISKS 10.67, Donald A. Norman writes a sermon on the Economist article
concerning airline safety.  He apparently didn't read the article itself, so
I wanted to clear up a few things.

The suggestions in the Economist article were not advanced by the Economist.
They were from Mr. Earl Weener, safety chief at Boeing.

Weener did not urge the introduction of a new decision speed.  He said that
in two out of three accidents involving a go/abort decision on takeoff, the
decision was wrong.  [ I find this hard to believe, since tossing a coin
would give the correct answer more often.  Maybe he just means the calculated
V1 speed was wrong, not that the decision was wrong. ]

Weener suggests improving the Ground Proximity Warning System (GPWS).  The
current GPWS is apparently not trustworthy, and accidents are caused when
pilots ignore its warnings.  Improving this system to the point where pilots
feel they can trust it seems like a good idea to me.

Boeing's four suggestions -- better estimates of V1, better GPWS, ILS
installed at more airports, and better use of flight data recorders -- all
seem to me to be good ideas, although I also think that Mr. Norman is right
in suggesting that a more comprehensive, systems approach is needed.

------------------------------

Date: Tue, 11 Dec 90 09:27:53 EDT
From: Jerry Leichter <leichter@lrw.com>
Subject: The Incredible Lightness of Reference

My article on executives and PC's indicates the issue of Business Week in
which it was published.
Since Business Week is widely available, you can dig up the original article
if you want more details or have doubts about the accuracy of my quotation.

Or can you?  Like many mass-market periodicals today, Business Week takes
advantage of the flexibility of computer and printing technology to produce
different editions for different audiences.  The article I quoted appeared
in the Bits and Bytes column of the Information Processing section.  That
section is part of the "Information Processing" pages specific to the
"Industrial/Technology" edition that I receive.  It may or may not appear in
other editions; if it does appear, it may appear on different pages, under
different headings, or even on different dates.  Business Week has its own
algorithms for deciding which edition to send you - they don't ask.  (I
suppose that if you ask to receive a particular one, they'll put you on the
appropriate list.)

Anyone who has written an article citing information gleaned from a network
posting knows that traditional citation techniques are not adequate for this
new medium: Even a citation to a respected, broadly-read moderated list like
RISKS would be very difficult for a traditional librarian to deal with, and
most network postings are evanescent, archived nowhere and impossible to
examine, much less verify, after the fact.  What's slipped in unnoticed is
that the same technological mismatch has begun to apply to the seemingly
traditional paper forms.

Today, it's newspapers and magazines.  Tomorrow, textbooks will be tailored
to individual school districts, individual schools, even individual classes
- a technology all the major textbook producers are working hard at
introducing.  Given that technology, perhaps we'll soon see different
editions of novels, even of non-fiction works, tuned to regional differences
in interest, dialect, social mores, or what have you.

The world network is supposed to be bringing us all together.  In some ways,
however, the same technology is acting to fragment our world: If everything
is "narrow-cast" to more and more finely subdivided audiences, what do we
share?

-- Jerry

------------------------------

Date: Tue, 11 Dec 90 09:04:08 EDT
From: Jerry Leichter <leichter@lrw.com>
Subject: Unexpected effects of PC's

For Many Executives, PC's Mean More Typing -- and Less Managing

Some computer technology, such as automated teller machines, has made life
easier.  But in business, that's not necessarily so.  A Georgia Institute of
Technology study found that personal computers can make life harder for
managers.  The reason?  When companies install PC's, they often cut back on
support workers.  So middle managers now spend a third of their time
performing administrative tasks and only 25% managing, the study found.
"Companies think they're going to try to pay for technology by letting
secretaries go," says Peter G. Sassone, the Georgia Tech economist who wrote
the study.  "But someone has to do the typing, filing, and copying."  He
says corporate buyers mistakenly think PC's can deliver the same degree of
productivity improvement that mainframes brought to inventory and payroll
jobs in the 1960's.  And "computer companies still try to sell that same
idea," he adds.

Ten to fifteen years ago, when word processing systems were starting to
spread rapidly, I (and others) pointed out an interesting paradox: These
systems were sold (and usually bought) as money savers.  The idea was that
you could get more work done with a smaller number of typists.
In practice, the number of typists was unchanged; what actually changed was
the quality (in some often hard-to-perceive sense) of the output: Since
making changes was now so cheap, documents would go through many more
revisions.

Well, it took a while, but the original "purpose" of that equipment has
reasserted itself - and this time it is being attained by management fiat.
Middle managers are particularly vulnerable, since information processing
technology has put so much pressure on them anyway.  (I saw an estimate not
long ago that there are some 25% fewer middle management jobs now than there
were in 1985.)  However, the trend is much broader.  My wife is a lawyer who
made the transition from a large Manhattan firm to an in-house counsel
position for a very large industrial company about a year ago.  One of the
changes she had to adjust to was the lack of support staff: There is one
secretary for three lawyers and several other professionals, and each person
is expected to do most of the word processing for his own jobs.  The comment
from a friend - also a lawyer, but one very involved in the future of the
profession - is that any lawyer starting out today had better learn to use a
word processor; it'll be part of his job within a few years at most.

-- Jerry

------------------------------

Date: Sat, 8 Dec 90 16:31:11 EDT
From: Jerry Leichter <leichter@lrw.com>
Subject: Long-distance printing, or the risks of being well-known

In a recent RISKS, Hank Nussbacher reports on printouts that were intended
for a local printer in North Carolina but, due to a one-character error in
specifying the receiving node, were regularly being printed in Israel.  He's
inspired me to write this note, which I've been meaning to get down on paper
for quite some time.

There's an interesting class of risks in computer systems, particularly
networked systems, that I call "the risks of being well known" - though I
suppose "the risks of knowing too well" is better :-).  The underlying
problem is that the ability to easily address and reach a huge number of
systems, without any built-in testing of the reasonableness of requests, can
lead to some very interesting failures.

1.  Digital sells a printer server known as the LPS40.  This is a Postscript
printer you stick on your Ethernet and then print to from a number of other
machines on the net.  In order for your machines to be able to find the
printer, you have to give it a DECnet node name.  In DECnet as it is today,
the namespace involved is flat.  The LPS40 documentation had many examples
in which the printer was addressed as node LPS40.  If you don't think things
through, and simply type the example startup commands as given, you will
have a series of machines trying to send output to a printer named LPS40.

This, indeed, happened at a number of sites at DEC several years ago.  If
your local DECnet configuration has never heard of node LPS40, your attempts
to start the software will fail.  However, one of the first LPS40's at DEC,
installed in Hudson, Mass., reserved that name; so it was in the standard
configuration database distributed throughout the company.  As a result, new
sites all over the world found themselves printing files in Hudson.  As it
happens, the protocol used for talking to an LPS40 is officially supported
only over Ethernets, and won't work RELIABLY over wide-area nets - but it
will work SOMETIMES.  When I heard this story, the record for long-distance
printing was from somewhere in Georgia.
2.  I'm not sure where I heard the following story; details would be welcome.
The Andrew system, developed at CMU, provides a variety of network file
services.  At first, it was used only at CMU; later, CMU started distributing
it to other universities.  One university got the source code and started
doing local modifications.  Then they found their modifications mysteriously
being removed - the system somehow migrated back to its old state.
Apparently, included with the source package was software to ensure that
local copies of the system stayed up to date.  On a regular (nightly?) basis,
the software checked for local files that differed from those on a
"reference" machine, and brought over "reference" copies if necessary.
Unfortunately, the "reference" machine was hard-coded as some machine at CMU!

3.  Much simpler, but much more widespread: There have been many reports of
FAX messages inadvertently sent to the wrong destination.

-- Jerry

------------------------------

Date: Tue, 18 Dec 90 14:07:27 EST
From: hoffman@eesun.gwu.edu (Lance J. Hoffman)
Subject: Organizational Aspects of Safety

RISKS readers might be interested in "Organizational Aspects of Engineering
Safety: The Case of Offshore Platforms" by M. Elisabeth Pate'-Cornell in
SCIENCE Magazine, p. 1210 ff. of 30 November 1990.  It describes how, while
organizational errors are often at the root of failures of critical
engineering systems, engineers tend to focus on technical solutions, in part
because of the way risks and failures are analyzed.  But, for example, in
some systems described, improving design review costs two orders of magnitude
less than adding steel to structures (the technical fix) to gain the same
improvement in reliability.

Prof. Lance J. Hoffman, Dept. of Electrical Engineering and Computer Science,
The George Washington University, Washington, D. C. 20052  (202) 994-4955

   [Also noted by haynes@ucscc.UCSC.EDU]

------------------------------

Date: 18 Dec 90 17:39:19
From: hunter@work.nlm.nih.gov (Larry Hunter)
Subject: Covert communication through public databases

This is not all that new, but I haven't seen discussion of the issue in
RISKS, so I thought I would post excerpts from an interesting InfoWeek
article (26 Nov 1990, pp.12-13):

[A] handful of major airlines - including American, United, Delta, and TWA -
are being sued by almost three dozen plaintiffs who allege that the airlines
use a shared database to "fix" prices and circumvent ... competition. ...
The ATPCO [Airline Tariff Publishing Co] database, critics allege, has become
an electronic forum wherein airlines communicate with each other to keep
ticket prices artificially high by discouraging competitive fares.

A number of techniques are used by the carriers to signal one another,
insiders say; for example, if a regional airline drops prices on a given
route in an effort to boost traffic, a larger airline may slash fares on its
flights in and out of the regional airline's hub airport, sending a strong
signal that it disapproves of the smaller airline's new fares.  The larger
airline's low fares may remain in effect for only a day or two, but the other
airline gets the message. ...  One airline spokesman acknowledged that
airlines watch the database and closely respond to competitors' actions, but
he calls that "the dynamics of the industry, not price fixing."
Ian Ayers, a faculty member at the law school at Northwestern University and
a specialist in anti-trust cases, says: "The issue is, are the airlines just
sharing data, or are they going beyond that and [through the database]
talking about what they are going to do about the data?"

"Back in the good old days," says John Timmons, minority counsel to the
Senate Commerce Committee and a close monitor of the airline industry, "if
you were going to fix the price of something like steel, you'd make a phone
call.  Today, you'd use technology, but to my way of thinking, it's just
like that phone call." ...

Lawrence Hunter, PhD., National Library of Medicine, Bldg. 38A, MS-54,
Bethesda, MD 20894  (301) 496-9300   hunter%nlm.nih.gov@nihcu (bitnet/earn)

------------------------------

End of RISKS-FORUM Digest 10.69
************************