idhopper (12/08/82)
I read Rick Eblaw's comments on "The Changing Face of Micro-computing" and was inspired to make some comments thereupon. Quotations from his item are indented.

    As a budding Computer Science student, I recognize that guys like
    myself are (at least in part) responsible for the proliferation of
    'personal computers' and the resultant software. After all, did we
    not claim that the computer was not to be feared, and was such a
    useful tool that we would soon be using it like we presently use
    our automobiles? In fact, we went further than that; we said that
    our up-and-coming products would be so easy to use that a child
    could become quite proficient in their use, and the child would
    learn more, etc. It almost follows logically that if a child could
    use and benefit from such a tool, then the parent could benefit
    even more, and "Look, honey, this wonderful tool is now quite
    affordable! We owe it to ourselves and our children to buy one."

Yes, we will soon be using computers like we use automobiles (well, hopefully with a little more care), but "soon" isn't here yet. The results of many years of research at places like Xerox PARC and MIT-LOGO/AI are just starting to become commercially available, and hopefully in a few years everyone will be able to afford such stuff. The problem comes from confusing these systems (which really are simple enough for children to use) with the kind of computer most families (and universities) can afford ("well, it IS a computer, isn't it?"). Between a 4K VIC-20 running BASIC, or even a VAX-11/780 running UNIX, and a Dorado running Smalltalk there isn't just a difference in degree, there's a difference in kind.

    We claimed that everyone and anyone could learn to program this
    wonderful tool, and we knew that was a half-truth.... Every
    semester, there are a large number who have trouble dealing with
    FORTRAN. A recent exam question asked the student to write a
    program segment which would print the word HELLO. Some of the
    answers we received were quite hilarious, but very sobering to one
    who believed that programming could become one of the skills of
    the masses.

In my opinion, the problem is not intrinsic to programming; it is intrinsic to FORTRAN. I have never done any serious FORTRAN programming, and I hope I never will; the language is simply unsuited to such a task. In one of the Creative Computing features on actor languages (~Oct/Nov 1980), Ted Nelson told an anecdote about a Computer Science prof who had "Department of Notational Engineering" on his door; I consider this very apt. A programming language is simply a notation that makes it easy for us to tell a computer what we want it to do, although to date the emphasis has been on the listener (the computer) rather than the speaker (the person). I do not think there is any fundamental reason why someone who can write the word "Hello", and can tell someone else how to do it, should not be able to tell a computer how to do it. The very fact that such a trivial task is on an exam shows the sorry state of the art.

    To my mind, the beauty of the Microcomputer Revolution is that
    many people who would not have discovered their own talent for
    programming can now indulge in it.

To MY mind, the beauty of the Microcomputer Revolution is that it will make the process we call programming (trying to cram your thoughts and ideas into the straitjacket of your average computer language) disappear almost entirely, to be replaced by smooth and artful interaction: Computopia. Until then...

--ravi
mwm@Okc-Unix@sri-unix (12/10/82)
From: Mike Meyer <mwm@Okc-Unix>

I'm sorry, but I just can't swallow programming as something that will become as common a skill as driving. It's not that I think that the masses are incapable: I think that anybody can learn to program with proper training. Just like I think that anybody can become a lawyer, or a molecular biologist, with proper training. The problem is in the training.

Programming (as it should be practiced) is an engineering field, though not as precise as the more established fields. The problem of people failing to answer seemingly trivial problems is not a problem in the language, but in the training. For example, on a recent quiz the problem "Write a program that echoes its arguments" was asked. The language was C in a Unix environment, and the problem is tailored to that language and environment. However, there were not only wrong answers; there were people who didn't answer the question at all.

The proper training requires too much time for most people to undertake it for pleasure. Likewise, to stay properly oriented, you need to keep track of the literature. See @u(Software Reflected [Reflections?]) by (blast, where did I put the thing - sorry, no author) for more on this topic.

However, people will try such things without training. Some of them succeed. A lot of them don't (I've spent a LOT of hours helping non-CS types out of holes). In addition, it is not at all difficult to get better training by buying a small system and playing with it than you would get from many universities. Just to add to the confusion, the industry needs people so badly they'll hire ANYONE who can code.

Please note that I don't wish to understate the effect of amateurs in the field. They have made significant contributions, and will probably continue to do so for quite a while. CS may be unusual in that it's one of the few sciences where the cost of doing useful work went down instead of up.
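For reference, the quiz question above has a short canonical answer in C; a sketch (factoring the work into a testable helper, `echo_args`, is my choice, not part of the quiz):

```c
#include <stdio.h>
#include <string.h>

/* Write argv[1..argc-1] into out, separated by single spaces --
 * the core of "echo its arguments".  A real answer would just
 * printf() inside the loop; building the string in a buffer keeps
 * the logic easy to check. */
void echo_args(int argc, char *argv[], char *out, size_t outsz)
{
    out[0] = '\0';
    for (int i = 1; i < argc; i++) {
        if (i > 1)
            strncat(out, " ", outsz - strlen(out) - 1);
        strncat(out, argv[i], outsz - strlen(out) - 1);
    }
}

/* Usage from main:
 *     int main(int argc, char *argv[]) {
 *         char line[1024];
 *         echo_args(argc, argv, line, sizeof line);
 *         puts(line);
 *         return 0;
 *     }
 */
```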
But (quickie survey) how many hobbyists (or non-hobbyists, for that matter):

1) Know the difference between:
   a) bubble sort
   b) insertion sort
   c) heap sort
   d) bucket sort
   e) quicksort
2) Would know which one to use on:
   a) 10 elements?
   b) 100 elements?
   c) 10,000+ elements?
3) Know the three canonical tree walks?
4) Know how to find the time complexity of a program?

Now, back to notations and languages. New languages won't lower the level of ability needed to handle large systems. They will just raise the size of a system considered large, by making them all easier to handle. People will still try to tackle things that are too large, and fail. We'll just be able to succeed on larger things. [Hopefully, programming techniques will also improve - but isn't a language just a mechanism for supporting techniques?] The size you can get away with will depend on your ability. And a properly trained person (whether self-trained or university trained) will be able to handle larger things than an improperly trained or untrained person.

As an analogy, I might be able to build a bridge over a small creek that would do the job. But there ain't no way I could bridge the Tacoma Narrows.

The changes we are seeing, and hopefully a lot of the problems, are due to Computer Science being a new field (<200 years in theory, <40 in practice). So new that most universities don't even recognize the (minimally) three different disciplines of theoretical CS, applied CS, and programming. Hopefully, the future will see universities universally teaching ALL the disciplines involved with producing software in an effective manner, and a resulting rise in both the percentage of successful projects and the size of the tackled projects. Plus, of course, a major increase in the power of languages to support our ideas.

Sorry to have been on the soapbox for so long.

<mike
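For the curious, item 3 of the survey — the three canonical tree walks (preorder, inorder, postorder) — looks like this in C. This is a sketch of textbook material, not code from the original post:

```c
#include <stddef.h>

struct node {
    int value;
    struct node *left, *right;
};

/* Preorder: visit the node, then its subtrees. */
void preorder(struct node *n, void (*visit)(int))
{
    if (n == NULL) return;
    visit(n->value);
    preorder(n->left, visit);
    preorder(n->right, visit);
}

/* Inorder: left subtree, node, right subtree
 * (yields sorted order on a binary search tree). */
void inorder(struct node *n, void (*visit)(int))
{
    if (n == NULL) return;
    inorder(n->left, visit);
    visit(n->value);
    inorder(n->right, visit);
}

/* Postorder: both subtrees first, then the node. */
void postorder(struct node *n, void (*visit)(int))
{
    if (n == NULL) return;
    postorder(n->left, visit);
    postorder(n->right, visit);
    visit(n->value);
}

/* Tiny collector used to demonstrate the walks. */
static int seen[16];
static int nseen;
static void collect(int v) { seen[nseen++] = v; }
```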
idhopper (12/12/82)
In response to Mike Meyer's response to my comments: thanks for your comments. You've made me take a critical look at my thoughts on the matter of programming; nonetheless, I still agree with most of what I said.

I did not say that programming computers would become as common as driving a car; I said that *using* them would be. However, with a well-designed language in which one does not have to continually re-invent the wheel, someone who simply uses a computer will have about as much power and flexibility as most programmers have today. Consider your sorting example: it would be possible (nearly trivial, in fact), in Smalltalk, to have an array automagically decide which sort algorithm to use when it receives the "sort" message, using what it knows about itself to pick the most efficient method.

At this point the "driving a car" analogy becomes more useful: there are drivers, and there are car designers. It takes a certain amount of training to become proficient in using a car, and much, much more to become proficient in designing one. However, once a car designer has used his extensive training to endow a car with certain capabilities, the driver of that car can use and combine them to fulfill his own needs; many of these will not have been foreseen by the designer but are still possible because of the flexibility built into the system.

An idea being explored by the Software Concepts Group at PARC (formerly the Learning Research Group) is that of "kits": a designer creates a set of modular parts for solving a certain kind of problem; the user of the kit then combines those parts in whatever way he sees fit (see "Introducing the Smalltalk-80 System," Adele Goldberg, August 1981 Byte). This is the kind of thing that I am thinking about when I talk about future "programming" systems -- a few people design the kits, and everyone else uses them.

I do disagree with your assertion that programming is an engineering field.
Any computer system has to interact with people, and this interaction is primarily at the conceptual level where hard-and-fast engineering rules do not exist. Therefore, choosing how to interact with that person is more than an engineering problem. I think that a more similar field is architecture or, to a lesser extent, industrial design: to produce a good interactive system one must balance both technical and artistic considerations. The technical considerations may limit what can be done, but the fundamental guiding principle is by nature artistic. --ravi
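Ravi's Smalltalk sorting example — an array choosing its own algorithm using what it knows about itself — can be sketched even in plain C. The cutoff value and the function names here are invented for illustration; this is not Smalltalk-80 behavior transcribed:

```c
#include <stdlib.h>

/* Insertion sort: O(n^2) in general, but very fast on small
 * arrays because of its tiny constant factor. */
static void insertion_sort(int *a, size_t n)
{
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];
            j--;
        }
        a[j] = key;
    }
}

static int cmp_int(const void *p, const void *q)
{
    int x = *(const int *)p, y = *(const int *)q;
    return (x > y) - (x < y);
}

/* smart_sort plays the role of the "sort" message: the caller
 * never picks an algorithm; the dispatch does, from the one thing
 * it knows about the data here -- its size.  The cutoff of 16 is
 * an arbitrary illustrative value. */
void smart_sort(int *a, size_t n)
{
    if (n < 16)
        insertion_sort(a, n);            /* small: low overhead wins */
    else
        qsort(a, n, sizeof *a, cmp_int); /* large: O(n log n) wins   */
}
```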
mwm@Okc-Unix (12/17/82)
From: Mike Meyer <mwm@Okc-Unix>
Hmm - it seems like I misread what you said vis-a-vis using vs. programming
computers. The analogy (as far as I can tell) works more like so:
Nearly everybody can drive a car - to some extent. Likewise, nearly everybody
can use a computer - to some extent. To drive a car skillfully enough to
avoid damaging things requires practice. Likewise, to get the most from the
software you already have requires practice. Finally, to really use a car
requires years of practice and study (just ask Jackie Stewart). Ditto for
programming a computer (and the two are more alike than they look). The analogy
breaks down with the idea of `kits.' I'm not sure how that will affect things,
but I predict that there will still have to be people who can build with the
kits to set things up for the end users. This is based on watching business
people struggle with CP/M and various menu-driven systems that run on it.
The end result will be that there are still programmers - they just have better
tools.
As for programming being an engineering discipline, I still maintain that to
be a tautology. Designing the interface is like architecture, but said
interface is part of the INPUT for a software engineer. The actual problem
for said people is:

    Map command language X to action set Y over data Z.

From that point of view, programming is engineering. The level where you
design the command language, choose the hardware, and (hopefully) examine
the feasibility of the project is another level up, and does, indeed, look
more like architecture than anything else.
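The "map command language X to action set Y over data Z" formulation above can be made concrete with a command table; a minimal C sketch (the commands, names, and data are invented for illustration, not from the post):

```c
#include <string.h>

/* Data Z: a single counter. */
static int counter;

/* Action set Y. */
static void do_incr(void) { counter++; }
static void do_decr(void) { counter--; }
static void do_zero(void) { counter = 0; }

/* Command language X, mapped onto Y by a lookup table. */
struct command {
    const char *name;
    void (*action)(void);
};

static const struct command table[] = {
    { "incr", do_incr },
    { "decr", do_decr },
    { "zero", do_zero },
};

/* Returns 0 on success, -1 for an unknown command. */
int dispatch(const char *name)
{
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        if (strcmp(table[i].name, name) == 0) {
            table[i].action();
            return 0;
        }
    }
    return -1;
}
```

The engineering content lives entirely in the table and the loop; what the command names should *be* is the architecture-level question one level up.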
<mike
P.S. - sorry for the somewhat incoherent nature of my last response. I found
out (afterwards) that the line I'm using was eating characters, etc, and
wasn't validating my input. This message should be better.
mwm