lacey@batcomputer.tn.cornell.edu (John Lacey) (07/29/89)
My school is currently using an old VAX (11/750) and VAX Pascal in its CS courses. In the last 2 years, one of the professors (the best one :-) ) has offered a course using Abelson & Sussman^2 as the text, with TI's PC Scheme on 8088/MS-DOS machines. This course, however, is offered as an upper level elective.

At this year's SIGCSE conference, there was a talk about using Lisp, and in particular Scheme, as a first programming language, that is, in the CS1 and CS2 courses.

I am a senior; a member of the university's computer policy committee, chair of the student math/cs board, and founder of the TeX Users Group. The math/cs board, together with the department faculty, is looking at replacing Pascal (and perhaps the VAX) as the main programming language. I would be interested in hearing from everyone about what they think of such a move, and what language they feel is the best to use.

My own prejudices are to use Scheme. Another choice, considerably more conservative, would be Modula-2 or Oberon. Ada, to my own taste, is completely out of the picture.

What say all of you?

P.S. What about comp.lang.paradigms? I would be very interested in a discussion about the usefulness of particular paradigms, especially as related to their effectiveness as teaching instruments. (Professionals are usually adept enough to fit the correct (or one of the better) paradigms (and a particular language associated with it) to the job at hand.)

--
John Lacey                   | cornell!batcomputer!lacey
lacey@tcgould.tn.cornell.edu | lacey@crnlthry.bitnet
manis@grads.cs.ubc.ca (Vincent Manis) (07/29/89)
In article <8514@batcomputer.tn.cornell.edu> lacey@tcgould.tn.cornell.edu (John Lacey) writes:
>My own prejudices are to use Scheme. Another choice, considerably more
>conservative, would be Modula-2 or Oberon. Ada, to my own taste, is
>completely out of the picture.

I think that deciding upon the language to use first is most definitely the wrong paradigm. Imagine deciding first whether to use alternating or direct current when teaching physics! You should look at what you want your course to accomplish before deciding upon the particular tools to use.

A first-year major-level science course ought to try to present the discipline as an integrated whole, covering not just the current consensus, but also the historical process by which that consensus was developed, and the questions which researchers in the field are currently investigating. It should also try to lure outstanding students into that field.

Teaching programming is clearly going to address none of these issues, as the report of the ACM Task Force on the Core of Computer Science (Jan CACM, I believe) states forcefully. Rather, we should concentrate upon characterising what computer science is, which to my mind means covering concepts such as abstraction and automata fully, and demonstrating how these concepts intertwine. Applications are important, but they should be significant ones, not just producing paycheques (mea culpa! mea culpa maxima!).

Not surprisingly, I consider Abelson and Sussman a seminal book in the field, because it does exactly what I have stated. Abelson and Sussman happen to use Scheme, and therefore, if one is using their book, the laboratory work (*not* programming assignments!) would be done in Scheme. I can imagine using Prolog or Oberon, but Scheme is already such a good candidate that there seems no point.

We at UBC expect to switch over to Abelson and Sussman fully in the fall of 1990 for our major students.
The final negotiations for University approval are at present just getting started. There is, however, the non-major population, who clearly neither want nor need Abelson and Sussman. With this group, programming is clearly not the major issue, and Pascal is quite suitable. We're looking at the UC Berkeley computer literacy course, `Computing Unbound', to serve this group.

Vincent Manis                  | manis@cs.ubc.ca
The Invisible City of Kitezh   | manis@cs.ubc.cdn
Department of Computer Science | manis%cs.ubc@relay.cs.net
University of British Columbia | uunet!ubc-cs!manis
Vancouver, BC, Canada V6T 1W5  | (604) 228-2394
"There is no law that vulgarity and literary excellence cannot
coexist." -- A. Trevor Hodge
ron@woan.austin.IBM.COM (07/29/89)
Cal Berkeley switched over to Scheme a few years ago for its intro class, and I found that it was a better first language than the ALGOL derivatives (Pascal, Fortran, C) because it teaches functional programming and recursion, as well as data structures -- at least if the instructor follows Abelson and Sussman. Then again, any language is probably fine if your instructors are willing to make the extra effort to introduce those topics early on, so students will be less confused later.

--
Ronald S. Woan at IBM-Austin
ron@woan.austin.ibm.COM    WOAN AT AUSVMV
Division 75 Department E63
tka4092@athena.mit.edu (Terry Alkasab) (07/31/89)
In article <8514@batcomputer.tn.cornell.edu> lacey@tcgould.tn.cornell.edu (John Lacey) writes:
>My school is currently using an old VAX (11/750) and VAX Pascal in its
>CS courses. In the last 2 years, one of the professors (the best one
>:-) ) has offered a course using Abelson & Sussman^2 as the text, with
>TI's PC Scheme on 8088/MS-DOS machines. This course, however, is offered
>as an upper level elective.
>
>At this year's SIGCSE conference, there was a talk about using Lisp, and in
>particular Scheme, as a first programming language, that is, in the
>CS1 and CS2 courses.
>
>What say all of you?

I just finished my freshman year at MIT, and though I am not a Computer Science major, I took the first class in the Computer Science sequence (6.001: Structure and Interpretation of Computer Programs, taught by Hal Abelson :-)!!). Many freshmen do; perhaps half the class is non-majors (there are perhaps a couple hundred in the class).

A *huge* percentage of the class already had programming experience (in whatever language: BASIC, Pascal, C, or what have you) and thus were not learning *programming* per se, but programming *theory*.

As an interesting note, on the first day, the professor asked how many students had no previous programming experience. When a smattering of the class raised its hands, the professor reassured them, saying that having no experience might prove beneficial, since learning the stuff he was about to teach would require many of their classmates to unlearn stuff they had long since learned. This was supported by a friend of mine who had problems with the class at first because she said she was trying to translate everything she did into BASIC.

Naturally, the class used Abelson & Sussman, and it was quite an interesting experience. Many of (what I understand to be) the important concepts in computer science were introduced in a deft manner through the use of Scheme.
I could probably translate these ideas into use in other languages (in fact, I do on a daily basis), but trying to introduce them in, say, Pascal... I think it would have been a whole lot harder. Further, Scheme is a language whose syntax is quite simple, and it is *very* easy to pick up.

The idea behind the class was *not* "Let's learn *another* programming language!" but "Let's learn computer science." And if a language is chosen which requires more attention than the concepts it is meant to introduce, then the entire purpose of the exercise is defeated.

My conclusion: as an introduction to computers, Scheme has nothing which immediately recommends it, IMHO. However, as an introduction to *computer science*, Scheme worked extremely well in my case. (Truth to tell, I kind of like the language. Lots of fun stuff you can do!)

--Terry

"Know thyself." --Socrates
"No thyself." --Zen's answer to Socrates

DISCLAIMER: If MIT doesn't like what I say, it's their own damn fault for letting me have an opinion.

Terry "Dweebie" Alkasab
gentile@horsey.dec.com (Sam Gentile) (08/01/89)
In article <13158@bloom-beacon.MIT.EDU>, tka4092@athena.mit.edu (Terry Alkasab) writes...
>In article <8514@batcomputer.tn.cornell.edu> lacey@tcgould.tn.cornell.edu (John Lacey) writes:
>>My school is currently using an old VAX (11/750) and VAX Pascal in its
>>CS courses. In the last 2 years, one of the professors (the best one
>>:-) ) has offered a course using Abelson & Sussman^2 as the text, with
>>TI's PC Scheme on 8088/MS-DOS machines. This course, however, is offered
>>as an upper level elective.
>>
>>At this year's SIGCSE conference, there was a talk about using Lisp, and in
>>particular Scheme, as a first programming language, that is, in the
>>CS1 and CS2 courses.
>>
>>What say all of you?

I have been a Software Engineer for 4 years now, first in the DOD world at Raytheon and now working on network applications. I have written the majority of my code (90%) in C, with the remainder being Raytheon assembly, MACRO-32, DCL, and FORTRAN. The first language I was taught as a Computer Engineering student was FORTRAN, and then I learned DEC MACRO-32.

Although I love C and do most of my work in C, I don't think it is a good first language for a student. I found C very difficult at first and had a lot of problems with it. Also, some C coders write code that is completely unreadable. I think BASIC should be abolished and certainly not taught as a first language; it gives people very bad first habits. I don't think FORTRAN is a good first language either. I think PASCAL is still the ideal first language for a student. It will expose the student to the concepts of pointers in a more friendly way than C and teach structured programming habits.

These are just my opinions on the subject. I would like to hear what other people have to say.
Sam Gentile          gentile%horsey.dec@decwrl.dec.com
Software Engineer    decwrl!horsey.dec.com!gentile
Digital Equipment Corp
Software Services Engineering - Network Engineering
5 Wentworth Drive GSF1-1/G13, Hudson NH 03051
-----------------------------------------------------------------------------
DEC is Number ONE in Networking!!!
-----------------------------------------------------------------------------
The views expressed are totally my own and do not reflect the views of
Digital Equipment Corp.
djones@megatest.UUCP (Dave Jones) (08/01/89)
From article <8514@batcomputer.tn.cornell.edu>, by lacey@batcomputer.tn.cornell.edu (John Lacey):
...
> The math/cs board, together with the department faculty, is looking at
> replacing Pascal (and perhaps the VAX) as the main programming language.

So far, so good.

> I would be interested in hearing from everyone about what they think of
> such a move, and what language they feel is the best to use.

I have some limited experience in this matter, having taught several introduction-to-CS courses at the freshman and sophomore level during a short stint as visiting prof at a medium-large midwestern university. It is my considered opinion, arrived at after much thought, that it depends.

[He waits for the laughter to subside...]

I think there should be two courses: one for CS majors, one for non-majors. Teach the course for non-majors in a relatively high-level language. Teach the course for CS majors beginning with a toy assembly language, then moving to C, and then (only then) consider LISP variants, unification languages, etc.

I don't hold with the popular notion that a beginning programmer will be ruined by understanding how computers really work, or that his brain will misfire if he finds out what the "high level" languages really do, and why. The idea that the student will "form bad habits" is specious. The student will better understand the higher constructs, having rediscovered why they were invented. I think teaching intro to CS starting with a language which has embedded lookup tables and automatic garbage collection is very confusing to the student.

> ...
> P.S. What about comp.lang.paradigms?

I am weary of the way computer jargon changes the meanings of perfectly good English words.

    paradigm: EXAMPLE, PATTERN; esp. an outstandingly clear or typical
    example or archetype.
        -- [Merriam] Webster's Ninth New Collegiate Dictionary

Is that what you mean?
ttwang@polyslo.CalPoly.EDU (Thomas Wang) (08/01/89)
I would suggest that Eiffel may be a good candidate for the first language. Eiffel is an industrial-strength language, and its heavy emphasis on abstract data types should fit nicely into a first year course.

-Thomas Wang  ("I am, therefore I am." - Akira)
 ttwang@polyslo.calpoly.edu
reggie@dinsdale.nm.paradyne.com (George W. Leach) (08/01/89)
In article <13158@bloom-beacon.MIT.EDU> tka4092@athena.mit.edu (Terry Alkasab) writes:
> I just finished my freshman year at MIT.....
>A *huge* percentage of the class already had programming experience
>(in whatever language: BASIC, Pascal, C, or what have you) and thus
>were not learning *programming* per se, but programming *theory*.

When I left high school nearly 15 years ago and began my freshman year, I too had some programming experience. We had a course in BASIC in high school that was only in its second year when I took it. While the teacher who instituted the course was an excellent math teacher, he was not qualified to teach this course. Consequently, many bad habits were picked up that had to be broken in an introductory CS course.

I would imagine that this is a far more serious problem these days with the proliferation of PCs and inexpensive compilers. The availability of programming courses at the primary and secondary school level must have exploded during the past decade. I know that my younger brothers had more hardware and courses available to them than I did in my high school days. The large base of self-taught hobbyists also contributes to the problem.

Computing is still a young discipline. There is a demand in industry for the skill of programming a computer that must be filled. There are also too many programs out there calling themselves Computer Science that do nothing more than teach programming and different languages. This is especially true in the two-year college programs. Certainly the situation has improved over the 70's, but we still have a long way to go.

>As an interesting note, on the first day, the professor asked how many
>students had no previous programming experience.
>When a smattering of
>the class raised its hand, the professor reassured them saying that
>having no experience might prove beneficial, since learning the stuff
>he was about to teach would require many of their classmates to
>unlearn stuff they had long since learned. This was supported by a
>friend of mine who had problems with the class at first because she
>said she was trying to translate everything she did into BASIC.

Many studies of programmer behavior have indicated that the novice programmer thinks in terms of a specific language syntax, while with experience comes more abstract thinking, without worrying about implementation-language details. The trick is to teach an introductory course that can avoid this pitfall. Most intro courses concentrate too much on introducing a specific programming language's syntax. One of the goals of the course is to enable a student to utilize this language in any of the advanced courses that will be encountered further down the road. Often the focus is on the programming language, with little emphasis on the theory behind computer science.

George W. Leach          AT&T Paradyne
(uunet|att)!pdn!reggie   Mail stop LG-133     Phone: 1-813-530-2376
P.O. Box 2826            FAX:   1-813-530-8224
Largo, FL USA 34649-2826
alanm@cognos.UUCP (Alan Myrvold) (08/01/89)
In article <8514@batcomputer.tn.cornell.edu> lacey@tcgould.tn.cornell.edu (John Lacey) writes:
>My school is currently using an old VAX (11/750) and VAX Pascal in its
>CS courses. In the last 2 years, one of the professors (the best one
>:-) ) has offered a course using Abelson & Sussman^2 as the text, with
>TI's PC Scheme on 8088/MS-DOS machines. This course, however, is offered
>as an upper level elective.

I would say that Scheme is the ideal language to learn first, but having purchased TI's PC Scheme for my own 8088/MS-DOS machine, I might argue that the performance (even at 10MHz with a hard disk) makes the language nearly unusable --- it really deserves a fast '386 box, or a workstation.

I'd personally like to see Pascal disappear as a first programming language... omitting support for separate compilation of units means that it's tough to talk about writing/using subroutine libraries (I'd rather teach Fortran or C!!!). Writing code that others can use, and using other people's code, MUST be taught early.

I've heard folk who'd like to see APL as the first language learned... but I find my own APL code hard to decipher after a few hours (even with liberal use of lamps!).

- Alan

A subject who is truly loyal to the Chief Magistrate will neither
advise nor submit to arbitrary measures.                    JUNIUS

---
Alan Myrvold            3755 Riverside Dr.     uunet!mitel!sce!cognos!alanm
Cognos Incorporated     P.O. Box 9707          alanm@cognos.uucp
(613) 738-1440 x5530    Ottawa, Ontario
                        CANADA  K1G 3Z4
tneff@bfmny0.UUCP (Tom Neff) (08/01/89)
There is more than one reason to learn a programming language. Some will be theoreticians, some will be systems wankers like myself, some will be applications drones. What you want for a "cherry" programming language is something that will give each of these groups something rewarding and revealing in terms of their later track. After that, they should split up and use more specifically appropriate languages.

The most important thing is LEAVING OUT spurious or unhelpful concepts, like line numbers in BASIC, or pointers to functions returning arrays of structures containing pointers to functions returning... in C, or about half of PL/I. :-) The simpler the better for an introductory language. All you really need to communicate to people is that a computer is something that does what you tell it to do.

--
"We walked on the moon --      (( Tom Neff
 you be polite"                 )) tneff@bfmny0.UU.NET
peter@ficc.uu.net (Peter da Silva) (08/01/89)
In article <3876@shlump.nac.dec.com>, gentile@horsey.dec.com (Sam Gentile) writes:
> I think PASCAL is still the
> ideal first language for a student. It will expose the student to the
> concepts of pointers in a more friendly way than C and teach structured
> programming habits.

I agree. Pascal, as a teaching language, is close to ideal. It's pretty conventional, hard to get lost in, and while it's fatally limited in the I/O department, that's not a big restriction for most teaching purposes.

Some people swear by LOGO or some other very high level language. I think a wide variety of such languages should be presented to students, but only after they're familiar with conventional data and control structures.

--
Peter da Silva, Xenix Support, Ferranti International Controls Corporation.
Business: peter@ficc.uu.net, +1 713 274 5180. | "The sentence I am now
Personal: peter@sugar.hackercorp.com.  `-_-'  |  writing is the sentence
Quote: Have you hugged your wolf today? 'U`   |  you are now reading"
albaugh@dms.UUCP (Mike Albaugh) (08/01/89)
From article <3876@shlump.nac.dec.com>, by gentile@horsey.dec.com (Sam Gentile):
> Although I love C and do most of my work in C, I don't think it is a
> good first language to learn for a student. I found C very difficult at first
> and I had a lot of problems with it. Also some C coders write code that is
> completely unreadable. I think BASIC should be abolished and certainly not
> taught as a first language. It gives people very bad first habits. I don't
> think FORTRAN is a good first language either. I think PASCAL is still the
> ideal first language for a student. It will expose the student to the
> concepts of pointers in a more friendly way than C and teach structured
> programming habits.

Just a (hopefully) short remark from one of those in the trenches. While I agree with most of the above, I have a queasy feeling about Pascal as a first language. I have been bitten _hard_ by three systems now in _very_ similar ways. Each was programmed in Pascal, and each (despite a high price and "professional" reputation) had a totally rookie-level bug in it. While not identical, the bugs all boiled down to the programmer not checking for overstepping the assumed boundary of some resource (disk space, pool memory, etc.). I cannot help but suspect that people who learn programming first in Pascal have been subconsciously trained that "the compiler will not let me exceed array bounds" and carry that belief into areas where it is patently false.

That said, I have no real objection to people learning Pascal in an introductory course, provided it can be _guaranteed_ they will learn some other language before writing any code that someone else will have to use. Of course, the same could be said for 8086 assembler :-)

Those are just (some of) my opinions on the subject. I would also like to hear what other people have to say.
BTW, I just finished a re-write of a program which, while nominally written in C, served as more proof that "Real Programmers can write Fortran programs in any language" :-)

| Mike Albaugh (albaugh@dms.UUCP || {...decwrl!turtlevax!}weitek!dms!albaugh)
| Atari Games Corp (Arcade Games, no relation to the makers of the ST)
| 675 Sycamore Dr. Milpitas, CA 95035          voice: (408)434-1709
| The opinions expressed are my own (Boy, are they ever)
mccalpin@masig3.ocean.fsu.edu (John D. McCalpin) (08/02/89)
In <see above> reggie@dinsdale.nm.paradyne.com (George W. Leach) writes:
>       Many studies of programmer behavior have indicated that the
>novice programmer thinks in terms of a specific language syntax, while
>with experience comes more abstract thinking without worrying about
>implementation language details. [...]

The trouble occurs when the poor students spend too many years with abstraction and forget that the purpose of the exercise is to solve a problem --- and that has to be done in a specific language with a specific syntax.

A sad example of this was the "GOTO" war last year (or maybe the year before) in the Communications of the ACM. In a patronizing letter, an eminent computer scientist (whose name will remain unmentioned) gave the "correct" solution to a simple problem that had been batted back and forth as an example of a construct that was made *easier* to read with a GOTO statement. The problem was that the "solution" was not written in an existing programming language, but in a very obscure pseudo-code.
rterek@shrike.Sun.COM (Robert Terek [Contractor]) (08/02/89)
From article <3876@shlump.nac.dec.com>, by gentile@horsey.dec.com (Sam Gentile):
> Although I love C and do most of my work in C, I don't think it is a
> good first language to learn for a student. I found C very difficult at first
> and I had a lot of problems with it. Also some C coders write code that is
> completely unreadable. I think BASIC should be abolished and certainly not
> taught as a first language. It gives people very bad first habits. I don't
> think FORTRAN is a good first language either. I think PASCAL is still the
> ideal first language for a student. It will expose the student to the
> concepts of pointers in a more friendly way than C and teach structured
> programming habits.

Gee, where is Bill Wolfe when you need him? I want to see some arguments that Ada is the best language to teach first! :-)

-------------------------------------------------------------------------------
Bob Terek              "If I'm not swinging from the rafters,
Software Consultant     I'm not having fun!"
mattias@emil (Mattias Waldau) (08/02/89)
In our four-year computer science programme (Master's), offered since 1981, we have always had Lisp as the first language. First it was MacLisp; now it is Common Lisp. The textbook we use is "Anatomy of Lisp" by Allen. We tried Abelson and Sussman one year, but the students didn't like that book. Prolog is the second language; Prolog and Lisp are used for most programming exercises, except where low-level languages like assembler and C are needed (e.g. OS).

But to the point: now and then we discuss using Prolog first, since an algorithmic language isn't actually needed until the students meet the three books of Knuth. The difference between clean programming in Lisp and in Pascal is just syntax; the approach to solving a programming task is the same. If the students know Lisp, they can learn Pascal, C, or Ada within weeks. But they are of course not professional programmers; that takes at least a year.
reggie@dinsdale.nm.paradyne.com (George W. Leach) (08/02/89)
In article <MCCALPIN.89Aug1155412@masig3.ocean.fsu.edu> mccalpin@masig3.ocean.fsu.edu (John D. McCalpin) writes:
>In <see above> reggie@dinsdale.nm.paradyne.com (George W. Leach) writes:
>>      Many studies of programmer behavior have indicated that the
>>novice programmer thinks in terms of a specific language syntax, while
>>with experience comes more abstract thinking without worrying about
>>implementation language details. [...]

>The trouble occurs when the poor students spend too many years with
>abstraction and forget that the purpose of the exercise is to solve a
>problem --- and that has to be done in a specific language with a
>specific syntax.

The solution to the problem must be expressed in a specific language for the purpose of implementing it on a machine, and yes, that is the ultimate goal. However, one does not use the implementation language during the steps prior to implementation, e.g. requirements, specification, design, etc. Unless one is just a hacker :-) The design of the solution need not be written down in the implementation language.

>A sad example of this was the "GOTO" war last year (or maybe the year
>before) in the Communications of the ACM. In a patronizing letter, an
>eminent computer scientist (whose name will remain unmentioned) gave
>the "correct" solution to a simple problem that had been batted back
>and forth as an example of a construct that was made *easier* to read
>with a GOTO statement. The problem was that the "solution" was not
>written in an existing programming language, but in a very obscure
>pseudo-code.

Sometimes pseudo-code is useful in order to express a solution in the form of a programming language, but without paying attention to all of the syntactic details of an implementation language. This allows one to concentrate on the solution method rather than the syntax in which that solution is expressed.
For example, if one wishes to communicate the solution to a file-merge problem, the interesting aspects of the solution revolve around the algorithm. We are not all that interested in how files are opened, read from, written to, checked for EOF, closed, etc. We can express these concepts in some abstract notation and translate to the implementation language later. The attention will be focused on the actual solution loop.

The idea is that the solution that was provided may be transcribed into an appropriate implementation language. While I agree that once one reaches this point you might as well use the implementation language to write code, there is a use for pseudo-code. How often do you refer to books on algorithms? Would you prefer one that expresses the algorithms in whatever language happens to be in vogue today, or one that expresses the algorithm in a generic manner that transcends language?

George W. Leach          AT&T Paradyne
(uunet|att)!pdn!reggie   Mail stop LG-133     Phone: 1-813-530-2376
P.O. Box 2826            FAX:   1-813-530-8224
Largo, FL USA 34649-2826
billms@dip.eecs.umich.edu (Bill Mangione-Smith) (08/02/89)
>Sam Gentile          gentile%horsey.dec@decwrl.dec.com
>Software Engineer    decwrl!horsey.dec.com!gentile
>Digital Equipment Corp
>Software Services Engineering - Network Engineering
>5 Wentworth Drive GSF1-1/G13, Hudson NH 03051
>-----------------------------------------------------------------------------
> DEC is Number ONE in Networking!!!
>-----------------------------------------------------------------------------
>The views expressed are my totally my own and do not reflect the views of
>Digital Equipment Corp.

So then what is the _official_ view of Digital Equipment Corp.? I guess they believe IBM is number one in networking? :)

bill mangione-smith
advanced computer architecture lab
university of michigan
you know, we're the guys who do software
------------------------------------------------------------------------------
I officially represent the opinions of Bo Schembechler and Steve Fisher,
but not Bill Freider or Bud Middaugh.
eugene@eos.UUCP (Eugene Miya) (08/03/89)
Well gee, this debate again...... Time to go, but some observations since the last time I read this (before .newsrc needed reconstructing):

0) Resolved in the past that multilingual environments were the way to go.

1) It's been interesting talking to a physicist about LISP ["Why would any one want to use a language like this?..."]. He does see value in it now, but still thinks we're crazy.

2) We took on a HS summer student (interested in fluid dynamics, not CS, but he knew BASIC). He said that his BASIC carried over into his vector Fortran and C (lots of tight loops). The point being that hardware can influence thinking as much as software, and people use computers for performance as well as flexibility.

3) I feel sorry for any student who has to learn Pascal as a first language these days. [considering my old X3J9 days]

Time for someone to set up a cron file to post a resolution on this frequently asked discussion. You're damned if you do, and damned if you don't.

Another gross generalization from

--eugene miya, NASA Ames Research Center, eugene@aurora.arc.nasa.gov
  resident cynic at the Rock of Ages Home for Retired Hackers:
  "You trust the `reply' command with all those different mailers out there?"
  "If my mail does not reach you, please accept my apology."
  {ncar,decwrl,hplabs,uunet}!ames!eugene
  Live free or die.
sergio@squid.rtech.com (Sergio Aponte) (08/03/89)
I believe PASCAL. This is the order I learned my languages in: PL/C, PL/1, BASIC, FORTRAN, COBOL, GMAP (assembly on Honeywell), PASCAL, and C. The only reason I learned PASCAL was because it was required for a Data Structures class. It is important to point out that the only ones I have used in a non-college environment are COBOL and C. But by the time I got to C, it was all a matter of syntax. Data structures, the problem-solving approach, and structured programming were elements that could be applied to them all.

PASCAL is easier on the teacher, since the rules make programs look more alike (easier to correct). It allows recursion, which was a very important and hard concept to grasp. It allows the student to get closer to what is going on in the computer than COBOL or BASIC would. For years I swore by PL/1, until PASCAL, and then C, came along.

I believe that the student MUST learn assembly sooner or later, even if never to use it again, because it gives the student insight into and an understanding of what is going on behind the crt. But by all means, don't start with it.

After a few years even my mother learned BASIC and COBOL (true story), but the problem-solving approach and understanding of structure were never taught to her. PASCAL would have opened her eyes to the more complex world behind the linear solution.

---
This has to be my opinion. Nobody else agrees with it!! Sergio. ;-{D
-------------------------------------------------------------------------------
| Internet: sergio@squid.rtech.com  Sergio L. Aponte, MTS @ RTI               |
| UUCP: {sun,mtxinu,pyramid,pacbell,hoptoad,amdahl,cpsc6a}!rtech!squid!sergio |
-------------------------------------------------------------------------------
darin@nova.laic.uucp (Darin Johnson) (08/03/89)
IMVHO, there are two types of beginning 'programming' classes. The first type is the type often taught at community colleges, or in math or engineering departments. The purpose of this type of course is to prepare the student for programming in the real world, or for later classes that will require programming. The second type is the first (or nearly first) class in computer science. In this type, the student is expected to learn programming concepts, with less emphasis on silly details.

For the first type of course, a language should be chosen that will probably be used later in the student's career, or that is used on the department's computers. For an engineering department, FORTRAN may be a good choice, since it will likely be used in later courses, and will inevitably be used after graduation. For a general course, probably Pascal, since it is widely available. Students who get good at Pascal can always move up to Modula-II later in life to get real work done. C will probably confuse the issue. Programming languages likely to be run into in later (non-CS) careers will probably look similar to Pascal, such as DBaseIII, etc. Stuff like Scheme, Smalltalk, etc., is probably a bad way to go. Remember, the majority of these students will not be able to just pick up a new language and learn it.

For the second type of course, you should pick a language that enforces or encourages what you are trying to teach. Students who finish this course will probably be expected to learn a variety of languages, so they should learn techniques that will help no matter what they use later. IMHO, this class should be taught concurrently with a basic CS course (boolean logic, high-level computer organization, etc.). The language chosen should be powerful enough to get simple jobs done with little coding (no PL/I here), demonstrate higher-level techniques simply (such as recursion, sets, possibly pointers), and have user-definable data structures.
Pascal would also be good for this, despite the bad rap it gets. It would also be good for students going into AI (I see a bunch of AI researchers who are lost when they have to use a traditional language). Scheme would be a good choice as well. It doesn't hurt to use a language that won't be used in other courses, since it gives a broader view of languages. Definitely don't allow students to graduate having used only one language, or they will get the impression that that language is the best tool for all jobs.

In my beginning class, I learned UCSD Pascal (at UCSD, of course :-). The thing I liked best, looking back, was the built-in 'turtle' graphics. This allowed us to get reasonably complex programs done before learning how to get around I/O. We learned recursion early on, by drawing fractal trees, etc. You could see on the screen what recursion did, and where you went wrong. Later, I was a proctor/assistant in that class, and in another class that used Pascal on non-graphics terminals. When teaching recursion, the students with the graphics learned it much more easily than those who only had a bunch of numbers as output.

Darin Johnson (leadsv!laic!darin@pyramid.pyramid.com)
We now return you to your regularly scheduled program.
paco@sbcs.sunysb.edu (Francisco J Romero) (08/03/89)
Standard ML is an excellent first language. It belongs to the Scheme family, but the parentheses were left out. The language is interactive, strongly typed, polymorphic, supports several levels of data abstraction, and its syntax is very, very friendly. Our freshmen here at Stony Brook love it. ML exists for VAXes, Suns and Macs. A version for the IBM PC is in progress. Paco.

Paco Romero                          "There was preserved in her
Dept. of Computer Science,            the fresh miracle
SUNY at Stony Brook, NY 11794         of surprise"
[paco@sbcs.sunysb.edu] (516)689-6953         -Jim Morrison
genesch@aplvax.jhuapl.edu (Eugene Schwartzman) (08/03/89)
In article <1095@kuling.UUCP> mattias@emil (Mattias Waldau) writes:
>
>But to the point: Now and then we discuss using Prolog first, an
>algorithmic language isn't actually needed until the students meet the
>three books of Knuth. The difference between clean programming in Lisp
>and Pascal is just syntax, the approach to solve a programming task is
>the same.
>

Agreed, but how do you explain to the student what a matrix is in Lisp?

>If the students can Lisp then they learn Pascal, C, Ada within weeks.
>But they are of course not professional programmers, that takes at
>least a year.

Disagree: once you learn Lisp, it is very difficult to change to a more restrictive language like the ones you mentioned. On the other hand, it is very easy to go from Pascal, C, or Ada to Lisp, since you already know how to 'think' in a language that has many different types, and it is possible to simulate those types in Lisp. If you learn Lisp first, you won't encounter things like 'records'; you will learn to use lists instead, and when you switch to Pascal/C/Ada you will simulate lists as arrays and it becomes a mess. I know of what I speak: I have programmed in Assembly, C, Pascal, Lisp, Prolog, FORTRAN, COBOL (ugh!!! :-( ), BASIC (double ugh!!!), and dabbled in Ada (triple ugh!!!! -- I wouldn't wish it on my worst enemy).

My choice would be Pascal, since everything you can do in any other language you can do in Pascal -- yes, even low-level bit manipulation; it's not in the standard, but it is in offshoots of it. Pascal allows you to be as restrictive as you like, and at the same time gives you enough freedom to manipulate things, without a watchdog eye like Ada's or looseness like C's. It even has recursion for the Lisp-lovers. I feel that Pascal is the ideal first language because after it people can choose their own paths: Lisp, Prolog, etc. for higher-level work, or C, assembly, etc. for OS work. Would you want to teach Lisp or Prolog to a future OS writer? I wouldn't, and shouldn't need to.
gene schwartzman genesch@aplvax.jhuapl.edu _______________________________________________________________________________ | GO BEARS, GO CUBS, GO WHITE SOX, GO BULLS, GO BLACKHAWKS, GO TERPS !!!!! | | Soccer is a kick in the grass (and sometimes on astroturf)! | | GO DIPLOMATS, GO STARS, GO BAYS, GO BLAST !!!! | | CFL -> GO EDMONTON ESKIMOS!!!! VFL -> GO CARLTON BLUES !!!! | |_____________________________________________________________________________| Disclaimer: These are my opinions and not of my employer.
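The matrix question raised above (lists versus declared array types) can be made concrete with a short sketch. The Python below is an editorial illustration added to this archive, not part of the 1989 thread (Python is used only because it is concise and neutral): the first pair of functions treats a matrix the Lisp way, as a list of row lists, while the second pair simulates the Pascal/C way, a flat block of cells addressed by an index formula.

```python
# Lisp-style habit: a matrix is just a list of row lists,
# with no separate type declaration.
def make_matrix(rows, cols, fill=0):
    return [[fill] * cols for _ in range(rows)]

def mat_get(m, i, j):
    return m[i][j]

# Pascal/C-style habit: one flat block of cells plus an index
# formula, closer to how a declared ARRAY[1..n, 1..m] is laid out.
def make_flat(rows, cols, fill=0):
    return {"cols": cols, "cells": [fill] * (rows * cols)}

def flat_get(m, i, j):
    return m["cells"][i * m["cols"] + j]

def flat_set(m, i, j, v):
    m["cells"][i * m["cols"] + j] = v
```

Both shapes hold the same data; the disagreement in the thread is about which habit transfers better to the other family of languages.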
jon@hanauma (Jon Claerbout) (08/03/89)
AWK ! I'm just an engineer (National Academy) who does a lot of coding in C, Ratfor (Fortran), and tinkering with C++, but I use AWK in system administration and I think AWK is a super language for beginners, because it is "great for real problems with natural language." Also, learning AWK, you learn C-like syntax, and also the UNIX environment. Further, the AWK textbook (Aho, Whoever, and Kernighan (AWK)) is outstanding! It is available in both interpreted and compiled forms. Against AWK, I admit that the language may not be available on micros, and the new version does not seem to be widespread, although it is readily available. If you are seriously interested in what language to teach first, be sure to look at the AWK book.
ken@aiai.ed.ac.uk (Ken Johnson) (08/03/89)
In article <5407@ficc.uu.net> peter@ficc.uu.net (Peter da Silva) writes:
>Some people swear by LOGO or some other very high level language. I think
>a wide variety of such languages should be presented to students, but
>only after they're familiar with conventional data and control structures.

I think this depends on what you are training the students to do. There is nothing you can do with conventional control structures that you can't do with recursion: for example, printing the numbers from 1 to N can be done with

to print_them :n
  if :n = 0 [stop]
  print_them :n - 1
  print :n
end

On the other hand, if you really want a WHILE statement you could define

to while :condition :command
  if run :condition [run :command  while :condition :command]
end

and then use that definition in a procedure:

to print_them :n
  local "i
  make "i 1
  while [:i <= :n] [print :i  make "i :i + 1]
end

I happen to think the first print_them has a grace and beauty that the looping version lacks, but that's my taste.
--
Ken Johnson, AI Applications Institute, 80 South Bridge, Edinburgh EH1 1HN
E-mail ken@aiai.ed.ac.uk, phone 031-225 4464 extension 212
`I have read your article, Mr. Johnson, and I am no wiser than when I started.'
-- `Possibly not, sir, but far better informed.'
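The Logo procedures above translate almost word for word into other teaching languages. As an added illustration (editorial, not part of the original post; Python is chosen only for brevity), here are the same two shapes, with each version returning the numbers it would print so the behaviour is easy to check:

```python
def print_them_rec(n):
    """Recursive shape, mirroring the first Logo print_them:
    stop at 0, produce 1..n-1 first, then emit n."""
    if n == 0:
        return []
    return print_them_rec(n - 1) + [n]

def print_them_loop(n):
    """Looping shape, mirroring the Logo 'while' version."""
    out = []
    i = 1
    while i <= n:
        out.append(i)
        i += 1
    return out
```

Both produce 1..n; the recursive one states *what* the sequence is rather than how to step through it, which is the grace-and-beauty point being made above.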
gillies@p.cs.uiuc.edu (08/06/89)
I learned languages in this order:

----- High School ----
TUTOR (PLATO system, ~ FORTRAN; later taught it to younger kids)
BASIC (later taught it to younger kids)
PASCAL (informal summer course)
C (helping a college student test a compiler)
------ College -------
PL/C (UI intro -- first programming course)
CLISP+ALGOL (MIT intro, "Structure & Interp. of Computer Programs")
FORTRAN (summer job)
CLU (MIT software engineering language)
MODULA-II (in my compiler implementation course)
------- Real World -----
MESA (at Xerox)

I believe I am a well-educated programmer. So my conclusion is: frankly, it doesn't matter much. Pick something reasonable (like C or Pascal) that is used in the real world. Pascal is ideal because it pampers beginners, but you can also write an OS or a compiler in Pascal. Pascal is a true multi-purpose language. C is a failure for beginners, even if they are forced to use LINT, because the error diagnostics are lousy (the compiler frequently cannot pinpoint the location of an error), and I am under the impression that this is a fundamental limitation of the syntax design of the C language.

Don't confuse the success of "Structure & Interpretation of Computer Programs" with the language SCHEME. The authors might well have chosen Smalltalk, or even C++, as their implementation language -- though some of the neat things in that book (at least the draft copy on my bookshelf) SIMPLY DON'T WORK in other languages. Also, realize that MIT is biased towards producing AI researchers. Does your school have this mission? Perhaps not. Perhaps another language would serve your students better.
tbc@hp-lsd.HP.COM (Tim Chambers) (08/09/89)
|From gillies@p.cs.uiuc.edu Sat Aug 5 16:38:00 1989 |................................................ Also, realize that |MIT is biased towards producing AI researchers. I must record my disagreement with this statement. Like gillies (whatever his or her real name is), I also got a degree from MIT. My diploma reads "Computer Science and Engineering" (VI-3, for the number fanatics in the audience :-). I feel that I came out very well-prepared for a software engineering career and not well-prepared to be an AI programmer. The curriculum is split between "traditional" software engineering topics (6.170 and 6.035) and AI (6.034, etc.). (A few EE courses are thrown in just in case we have to deal with hardware someday -- an excellent idea, IMHO.) But the intent was *not* to teach students how to become AI programmers. It just so happens that the MIT CS department believes AI is a significant part of computer science. I'm responding here to try to prevent readers from concluding that Sussman and Abelson are teaching AI just because they are from MIT and use a LISP language dialect to embody the concepts they teach in their book. As I said in my first posting -- the language doesn't matter. Teach the *concepts*. Since you have to pick one first, use SCHEME. It's good enough for MIT freshmen. |From jon@hanauma Thu Aug 3 00:15:02 1989 |AWK ! I didn't see the :-) in the posting, but I still can only respond with a hearty hardy har, har. (I like AWK, too. I've even seen AWK used for AI! :-)
mccaugh@s.cs.uiuc.edu (08/09/89)
For anyone interested in keeping tally, of the preceding 24 responses to this note, PASCAL was the clear winner, with ML faring the worst.
lacey@batcomputer.tn.cornell.edu (John Lacey) (08/10/89)
In article <207100002@s.cs.uiuc.edu> mccaugh@s.cs.uiuc.edu writes:
} For anyone interested in keeping tally, of the preceding 24 responses to
} this note, PASCAL was the clear winner, with ML faring the worst.

Well, I have been very interested, and I have also been privy to many direct e-mail responses (being the one who asked the question), and in 75 responses I see a much different pattern. The earlier responses are, in general, more coherent (I prefer x, yah, yah, yah), whereas the more recent postings have degenerated into my-language-is-better-than-yours arguments. However, of those expressing a clear preference, in both the comp.edu and comp.lang.misc newsgroups (oh, do I wish I had used follow-up), and in personal mail messages, there are a total of 41, broken down as follows:

Pascal     13
Scheme     12
Modula-2    4
Ada         3
ML          3
Eiffel      2
Prolog      2
AWK         1
Turing      1

In addition, at least 2 of those voting (?) for Pascal explicitly mentioned features not in standard Pascal, but which are among the major additional features of Modula-2 (its closest relative); among these are separately compilable modules, open and dynamic arrays, and strong type checking (e.g., between enumerated types).

So, I have to disagree with both ends of the previous posting. Pascal was not a clear winner, nor did ML fare at all badly. Also, several languages were mentioned, either as runners-up or simply as interesting possibilities. Modula-2 (runner-up to Pascal) was the most common of these. Also, Emacs Lisp was mentioned by several people, as were CLU and Smalltalk.

I would like to thank everyone for their contributions. The real conclusion I have reached from all of the responses is that these numbers, and the particular languages especially, are not and should not be the focus. I still intend to post a what-I-learned message, probably in the next couple of days.
Cheers, -- John Lacey lacey@tcgould.tn.cornell.edu cornell!batcomputer!lacey After August 16: jjlacey@owucomcn.bitnet If you have to, try mdl@sppy00.UUCP or maybe {...}!osu-cis!sppy00!mdl
gillies@m.cs.uiuc.edu (08/10/89)
/* Written 12:02 pm Aug 8, 1989 by tbc@hp-lsd.HP.COM in m.cs.uiuc.edu:comp.edu */
> |From gillies@p.cs.uiuc.edu Sat Aug 5 16:38:00 1989
> |................................................ Also, realize that
> |MIT is biased towards producing AI researchers.
>
> I must record my disagreement with this statement. Like gillies
> (whatever his or her real name is), I also got a degree from MIT.....
> [doubts MIT is geared towards producing AI graduates]

Now that several people have questioned me, it's time to elaborate. Like nearly all MIT students who spend a small fortune on their education, I kept all my course notes and assignments. Here is a sampling of the homework problems / labs (there are two assignments per week in 6.001 -- a homework & a lab):

PS 1   -- prefix, predicting the lisp evaluator, using "defun"
PS 2   -- predicting the lisp evaluator, box & pointer diagrams
PS 3   -- change for $1, predicting recursion, list filter/transformation
PS 4   -- association lists, semantic networks
PS 5   -- queues, triplets
PS 6   -- control abstraction, lisp evaluator, free vars, evaluation
PS 7   -- passing functions around, closures, random nums, tape data structure
PS 9   -- making a lisp evaluator codeable in assembly
LAB 2  -- set operations, sets as binary trees
LAB 3  -- ELIZA
LAB 4  -- varieties of graphical search (depth, breadth)
LAB 5  -- implementing a LISP evaluator
LAB 6  -- queueing-system event-based simulation
LAB 8  -- how to implement MACSYMA
LAB 9  -- bubble sorting in ALGOL
LAB 10 -- implementing LISP structures in ALGOL
Quiz 1 -- basics, bacteria image processing, algebraic manip (MACSYMA)
Quiz 2 -- bank accounts (closures), lisp evaluator
Quiz 3 -- using simulated LISP from ALGOL
Final  -- evaluator basics, box & pointer diagrams, implementing permutations, converting a number to ASCII, simulating recursion with stacks, doing MAPCAR in ALGOL

I have to laugh at the ALGOL part of this course. Ever try to read LISP written in ALGOL?
Let me tell you, octal i432 machine code is probably more legible. Apparently, these ALGOL programmers have a serious fear of FOR/WHILE loops -- every piece of code is recursive!

Let me call your attention to the AI nature of PS4, PS9, LAB3, LAB4, LAB5, LAB8, and all the perverse ALGOL assignments. On the other hand, problems that you might find in a "normal" intro to C.S. course occur in PS3, PS7, LAB2, LAB9, QUIZ2 (bank accounts), and the FINAL (converting a number to ASCII). In other words, more than half the course was devoted to arguably AI-type problems. Furthermore, good documentation practices WERE NEVER EVEN MENTIONED IN 6.001. This is why I stated that MIT's intro course was geared towards producing AI students. It certainly has major holes compared to the courses at other schools.

I think I said this before, but it makes good sense to pick a language that will be reused in upper-level courses. Clearly LISP is a great choice for MIT, since one upper-level AI course is required, several other AI-type courses are "restricted electives", and students might do UROP (undergrad research) or write their UG thesis in LISP.

MIT didn't have this good sense. They tried to teach a compiler course without requiring the students to know the implementation language. It's hard to learn 2 new languages and also compiler technology in one semester!

Don Gillies, Dept. of Computer Science, University of Illinois
1304 W. Springfield, Urbana, Ill 61801
ARPA: gillies@cs.uiuc.edu UUCP: {uunet,harvard}!uiucdcs!gillies
faustus@fir.Berkeley.EDU (Wayne A. Christopher) (08/11/89)
In article <1095@kuling.UUCP> mattias@emil (Mattias Waldau) writes: > ... The difference between clean programming in Lisp > and Pascal is just syntax, the approach to solve a programming task is > the same. I really have to disagree with this. Thinking in a declarative language such as Prolog is quite different from thinking in a procedural language. In Prolog, you are encouraged to write down assertions about the problem, and operations that are very basic to procedural programming, such as modifying the value of an object, are seldom needed and extremely awkward when they are needed. Probably it is easier to become good with Prolog if you haven't learned a procedural language first, and perhaps vice-versa. Wayne
crcraig@athena.mit.edu (Christopher R Craig) (08/11/89)
In article <4200022@m.cs.uiuc.edu> gillies@m.cs.uiuc.edu writes:
>I think I said this, but it makes good sense to pick a language that
>will be reused in upper level courses. Clearly LISP is a great choice
>for MIT, since one upper-level AI course is required, several other
>AI-type course are "restricted electives", and students might do UROP
>(undergrad research), or write their UG thesis in LISP.
>
>MIT didn't have this good sense. They tried to teach a compiler
>course without requiring the students to know the implementation
>language. It's hard to learn 2 new languages and also compiler
>technology in one semester!

I don't know when this was, but it certainly is no longer the case. The laboratory portion of 6.035 is done in CLU, which we had to learn in 6.170 (software engineering laboratory).

Back to the "6.001 is biased towards AI people" debate. I don't really agree with that. I've got my old problem sets too, and they don't look that bad. Is an adventure "game" an AI hack? How about implementing run-time type checking in the metacircular evaluator? If you ask me, the problem sets just reinforce the abstraction and computational concepts that Abelson & Sussman teach. All I know is, 6.001 didn't make me want to become an AI person. No way. Not a chance :-)
----------------------------------------
Chris Craig MIT '89 crcraig@athena.mit.edu
twl@brunix (Theodore W. Leung) (08/11/89)
Don Gillies writes:
> MIT didn't have this good sense. They tried to teach a compiler
> course without requiring the students to know the implementation
> language. It's hard to learn 2 new languages and also compiler
> technology in one semester!

In all fairness, the curriculum at MIT has changed a bit since you were there. The required laboratory in software engineering (6.170) presents the language (CLU) needed for the compiler construction class. Also, documentation and good design are stressed in the software engineering lab.
--------------------------------------------------------------------
Internet/CSnet: twl@cs.brown.edu | Ted Leung
BITNET: twl@BROWNCS.BITNET | Box 1910, Brown University
UUCP: uunet!brunix!twl | Providence, RI 02912
hugo@sunapee.dartmouth.edu (Peter Su) (08/13/89)
In article <207100002@s.cs.uiuc.edu> mccaugh@s.cs.uiuc.edu writes:
>
> For anyone interested in keeping tally, of the preceding 24 responses to this
> note, PASCAL was the clear winner, with ML faring the worst.

I think this is a sad testimony to the state of CS education. Having seen intro students suffer through courses in Pascal, I can say that I think the language is just too complicated. Pascal courses spend all their time teaching people Pascal and *not* concepts useful for programming (or computer science) in general. Here are some things you can't teach in Pascal:

1) General procedural abstraction... procedures are not first-class objects.
2) Data abstraction
3) The whole idea of virtual machines and the relationship between the language and the computer.

Of course, maybe I'm biased, because Scheme provides good mechanisms for all of these concepts, and Abelson and Sussman's book is organized along these lines. I happen to think this is the right way to do things; you may disagree with me. Computer science (and programming) is about abstraction, and every program that we use presents us with a virtual machine representing a different abstraction. I think this is the important thing for people to learn, and once you have learned it, you can apply the idea independently of the language that you happen to be using in the real world. But, in learning these ideas, you have to have a language that makes the expression of them as easy as possible, and I think Scheme/Lisp does, for a few reasons:

1) Simple syntax. Lisp's syntax takes a day to learn, maybe less. In comparison, at CMU they use complex structure-editing systems to try to shield the students from Pascal's complex syntax. And Pascal is pretty simple compared to, say, Ada or C++.

2) Simple semantics. You don't have to teach two different kinds of parameter passing. You don't have to teach them about pointers, allocating memory, records and all that nonsense before they can do useful things.
I think the only semantic problem with Lisp is the idea of a special form, i.e., a form that does not evaluate its arguments, but that can be covered pretty easily.

3) Interpreters are good for easy debugging.

4) Functions are first-class objects. Student-written abstractions become part of the language just as naturally as the built-in abstractions.

5) I/O is easy. Teaching people how to do I/O in Pascal is a pain in the &*^%$.

Anyway, I've rambled long enough. Keep in mind that these are just the ideas of a mixed-up grad student who knows very little about teaching in general. I just want to see more intro CS classes that are about programming and not about Pascal/Fortran/whatever.

Pete
hugo@sunapee.dartmouth.edu
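Points 1) and 4) above are the heart of the Scheme argument, and can be made concrete with a tiny sketch. This is an editorial illustration added to the archive, written in Python rather than Scheme only so it is compact; the idea it shows (student-written higher-order procedures composing like built-ins) is exactly what the post describes:

```python
def compose(f, g):
    """A student-written control abstraction: the function that
    applies g first, then f."""
    return lambda x: f(g(x))

def summation(term, a, b):
    """A SICP-style higher-order procedure: sum term(k) for
    k from a to b inclusive."""
    return sum(term(k) for k in range(a, b + 1))

def square(x):
    return x * x

# Once defined, these compose like built-ins:
sum_of_squares = summation(square, 1, 3)            # 1 + 4 + 9 = 14
inc_then_square = compose(square, lambda x: x + 1)  # x -> (x + 1)^2
```

Standard Pascal can pass procedures as parameters (so a summation-like routine is possible), but it cannot return them or close over local state, which is what rules out compose; that is the substance of point 1) in the "can't teach in Pascal" list.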
gds@spam.istc.sri.com (Greg Skinner) (08/13/89)
In article <13419@bloom-beacon.MIT.EDU> crcraig@athena.mit.edu (Christopher R Craig) writes:
>In article <4200022@m.cs.uiuc.edu> gillies@m.cs.uiuc.edu writes:
>>MIT didn't have this good sense. They tried to teach a compiler
>>course without requiring the students to know the implementation
>>language. It's hard to learn 2 new languages and also compiler
>>technology in one semester!
>
>I don't know when this was, but it certainly is no longer the case.
>The laboratory portion of 6.035 is done in CLU, which we had to learn
>in 6.170 (software engineering laboratory).

6.035 (the compiler course) underwent major revision from 1982 to 1983. In 1982, it was taught by Mike Hammer. There was no lab (i.e., no requirement to actually write a compiler or any portion of one). Assignments consisted of essay-type questions concerning various problems of compiler design and implementation. Exams were multiple-choice, with penalties for wrong guesses. (A few unfortunate people wound up with negative scores.)

In 1983, after Mike Hammer left, they restructured the course to implement the scanner, parser, semantic checker and code generator in CLU on TOPS-20. They did not announce this change until the semester before the course was offered, so some students did not have an opportunity to fit 6.170, the course where the CLU language is used to implement software projects of various sizes, into their schedules.

Other people have probably posted on this in the past, but frankly I thought 6.035 that semester was a nightmare (and not just because CLU was required -- in fact, I rather liked programming in it). I believe the teaching staff was just as unprepared to cope with a compiler lab as the students were to take it. There were many complex administrative problems that cropped up during the course, such as many students dropping (forcing the lab groups to be rearranged), overloaded computer facilities, insufficient staffing, conflicts with other subjects, etc.
They took pity on us and allowed us to switch to pass/fail if we wished. >Back to the "6.001 is biased towards AI people" debate. I don't >really agree with that. My personal opinion of 6.001 is largely independent of whether or not it trains for future AI work. I found it to be entirely too fast-paced for a freshman subject (and this is speaking from the perspective of someone who taught a seminar on it one IAP). There are a lot of deep concepts introduced here that most colleges do not touch until junior or senior year. It is great if you are the type of person that can keep up with the pace but very distressing if you cannot. I don't know what MIT's undergraduate CS curriculum these days is, but the last time I looked you could take 5 courses either using AI tools or covering AI concepts (Abelson & Sussman, AI, problem-solving paradigms, machine vision, and robot manipulation), as opposed to one in operating systems, one in compilers, one in computer architecture, none in databases, and none in numerical analysis. There is a definite bias towards AI, and perhaps not enough towards other parts of CS. --gregbo MIT '84
crcraig@athena.mit.edu (Christopher R Craig) (08/14/89)
In article <24691@joyce.istc.sri.com> gds@spam.istc.sri.com (Greg Skinner) writes: >Other people have probably posted on this in the past, but frankly I >thought 6.035 that semester was a nightmare (and not just because CLU >was required -- in fact, I rather liked programming in it). I believe >the teaching staff was just as unprepared to cope with a compiler lab >as the students were to take it. There were many complex >administrative problems that cropped up during the course, such as >many students dropping forcing the lab groups to be rearranged, >overloaded computer facilities, insufficient staffing, conflicts with >other subjects, etc. They took pity on us and allowed us to switch to >pass/fail if we wished. Heck, I just thought it was plain *hard*. The logistical problems were the least of my worries, although the load on Deep-Thought (the TOPS20 machine) was still lousy. They moved it to Project Athena now, so it's a lot better. What I thought was hard was the combination of rather difficult theoretical material (parsing and error handling esp.) combined with a lot of non-trivial programming in a nominally 12-unit course. I'm glad I took it, 'cause now I sort of know what goes on inside a compiler, but it was tough. >I don't know what MIT's undergraduate CS curriculum these days is, but >the last time I looked you could take 5 courses either using AI tools >or covering AI concepts (Abelson & Sussman, AI, problem-solving >paradigms, machine vision, and robot manipulation), as opposed to one >in operating systems, one in compilers, one in computer architecture, >none in databases, and none in numerical analysis. There is a >definite bias towards AI, and perhaps not enough towards other parts >of CS. Ok, maybe the curriculum *can* be biased toward AI, but it doesn't have to be. Except for the first 2 (and I don't think Abelson & Sussman is AI), all the rest are restricted electives that you don't have to take. 
I know of very few people who took any of the latter 3; most take either the probability course or the algorithms course. There are also plenty of other classes around to take. One of the more interesting ones I took was 6.313 (Contemporary Computer Design), taught by Tom Knight. I'm willing to bet that's as good an undergrad course in computer architecture as there is at most schools. If MIT is biased toward anything, it's toward producing CS theorists. They don't concentrate on producing software engineers. ---------------------------------------- Chris Craig MIT '89 crcraig@athena.mit.edu
gillies@p.cs.uiuc.edu (08/15/89)
/* Written 10:36 am Aug 11, 1989 by twl@brunix in p.cs.uiuc.edu:comp.edu */ > Also, documentation and good design are stressed in the > software engineering lab. In other words, documentation is taught as an "advanced concept", not appropriate for the beginning course. I'm sorry, but I still find this attitude to be an incredible crock, and I'm shocked that it still persists at my Alma Mater. Don Gillies, Dept. of Computer Science, University of Illinois 1304 W. Springfield, Urbana, Ill 61801 ARPA: gillies@cs.uiuc.edu UUCP: {uunet,harvard}!uiucdcs!gillies