dab@oswego.Oswego.EDU (Dave Bozak) (02/07/89)
What is the general attitude toward Turing as the next language to use
in intro CS classes?  And, more generally, as a substitute that enables
a one-language approach across classes in data structures, file
processing, assembler, and other "core" courses (which would minimize
the learning curve when moving to C, C++, Euclid, ...)?

dave bozak
computer science
SUNY College at Oswego
dab@rocky.oswego.edu
lea@ziebmef.uucp (Simon Peter Lea) (02/14/89)
In article <1059@oswego.Oswego.EDU> dab@oswego.Oswego.EDU (Dave Bozak) writes:
>What is the general attitude toward turing as the next language to use
>in intro cs classes? and more generally, as a substitute that enables
>a one language approach towards classes in data structures, file
>processing, assembler, and other "core" courses (which would minimize
>the learning curve when moving to C, C++, Euclid, ...).
>
>dave bozak
>computer science
>SUNY College at Oswego
>dab@rocky.oswego.edu

In my opinion, to the average CS undergrad, Turing is just Pascal with
modules -- very similar to Turbo Pascal (4.0 on PC and Mac), which is of
course similar to Modula, etc.  Having used it for the past three years
myself as an undergrad here at U of T, I can vouch for its elegance and
how simple it is to learn initially.  This probably has a great deal to
do with the fact that we were taught Pascal in high school, but the
general feeling I get from my fellow students is that it doesn't really
make that much difference whether you know Pascal beforehand or not.

The language has been used here at U of T, so I understand, for some
systems programming tasks that are normally tackled in C.  For example,
one of my profs has written a small UNIX-like O/S (similar in spirit to
MINIX) called Mtunis.  It was written entirely in Turing Plus, which is
a concurrent language much like Concurrent Euclid.

These are just my general comments; I can't really say one way or the
other whether Turing is definitely the best second language to use in
CS undergrad courses -- I am biased. :-)
-- 
"HAL would have passed the Turing test with ease." -- A.C.C., 2001
Simon Lea -- University of Toronto, Department of Computer Science
UUCP: {utzoo!telly,utgpu!lsuc!ncrcan}!ziebmef!lea     lea@ziebmef.UUCP
InterNet, BITNET, CSnet, NetNorth, CDNNET, EARN: lea@gpu.utcs.utoronto.ca
pattis@june.cs.washington.edu (Richard Pattis) (02/17/89)
In article <1989Feb13.182342.2022@ziebmef.uucp>, lea@ziebmef.uucp (Simon Peter Lea) writes:
> In my opinion, to the average CS undergrad, Turing is just Pascal with
> modules. Very similar to Turbo Pascal (4.0 on PC and MAC) - which is of
> course similar to Modula, etc. Having used it for the past three years
> myself as an undergrad here at U of T, I can vouch for its elegance
> and how simple it is to learn initially....

I'd like to suggest that Turing is a much better thought-out language
than either Pascal or Modula-2.  It may look like both of these with a
few changes and additions, but the similarities at the syntactic level
hide the big advantages of Turing at the semantic level.

Take parameter modes.  They look just like Pascal's/Modula-2's, but
there is a big semantic difference: they behave like Ada's IN and
IN OUT modes.  That is, all parameters are transmitted in O(1); the
restriction on non-VAR parameters is that the body of the subprogram
cannot change them (checked and flagged at compile time).

So what's the big difference?  Look in a random Pascal book at binary
searching of arrays.  In 50% of the books you will see the array passed
as a non-VAR parameter, which means that it will be copied, which means
that the subprogram runs in O(N), not O(log N) -- and if the authors
implement it recursively, it's O(N log N).  Why do authors choose
non-VAR?  Because the array doesn't change; Pascal forces you to choose
a mode in an awkward way.  And what happens in Pascal/Modula-2 if you
accidentally omit VAR?  The item is copied, changed internally, and then
goes away -- causing very strange problems at run time.  This cannot
happen in Turing: forgetting VAR and then trying to change the parameter
yields a compile-time error.

So there is more than meets the eye in Turing.

Rich Pattis