[net.unix] students editing output

hartley@uvm-cs.UUCP (Steve) (09/13/85)

  We have two VAXen here running 4.2, and besides faculty and graduate student
research, we have some classes on the machines.  Although interactive use with
full-screen editors is a giant leap forward from the not-so-distant past of
batch submission using cards, the latter did have the advantage of offering some
assurance that the students hadn't tailored their output with an editor to be more
correct.  I do not mean to suggest that all students would
take advantage of this feature, but I have heard rumors from some of the
students in classes I have taught in the past that a little of this did go on.
I can see the temptation arising the night before an assignment is due and the
program doesn't work.  There the (incorrect) output sits in a file, and all the
professor wants is the output file printed on a terminal along with a listing.
Who's to know if it is edited a little to change those incorrect numbers ....
  I am wondering if there are batch submission systems out there running under
4.2 BSD that are integrated with the line-printer spooling system.  A student
would gather up program source and input data along with a shell script on
how to compile, load, and go, and submit it to the batch server.  The output
would automatically go to a printer, offering some assurance that it hadn't
been tweaked.
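
Something along these lines is roughly what I have in mind (only a sketch; the
"submit" name, the directories, and the printer setup are invented for
illustration):

	#!/bin/sh
	# submit - hypothetical hand-in: compile the program, run it on the
	# supplied data, and queue source plus output straight to the line
	# printer, so the output never sits around in an editable file.
	src=$1; data=$2
	tmp=/tmp/submit$$
	cc -o $tmp.run $src || exit 1		# give up if it won't compile
	$tmp.run < $data > $tmp.out		# run it on the student's data
	(echo "Submitted by `whoami` on `date`"
	 cat $src
	 echo ""
	 cat $tmp.out) | lpr
	rm -f $tmp.run $tmp.out
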
  The only way I know of to check a student's work is to go through a demo.  But
this is terribly time consuming, and it is hard to set a precise due date
(unless you check file modification times).
  What do other people do?  Thanks.
-- 
"If that's true, then I'm the Pope!"			Stephen J. Hartley
USENET:	{decvax,ihnp4}!dartvax!uvm-gen!uvm-cs!hartley	University of Vermont
CSNET:	hartley%uvm@csnet-relay				(802) 656-3330, 862-5323

luner@uwai.UUCP (09/13/85)

Stephen Hartley (hartley%uvm@csnet-relay) worries (and, I believe, not
without cause):

> ... students ... tailor[ing] their output with an editor to be more
> correct. What do other people do?

As an instructor, I view this as academic dishonesty. I would treat it
as equivalent to a researcher faking experimental data. Depending on the
rules set down by your school/department/university this may be grounds
for:
	1. Failure in the course
	2. the big zero for the assignment
	3. academic probation
	4. expulsion

I make a point to tell my classes that the faking of results is a serious
breach of trust and will be dealt with severely. I emphasize that I have seen
the programs I ask them to write in many versions, both correct and incorrect,
and that they should not expect an error to slip by. [The truth of this depends
on exactly what language and what programs they're doing, but they don't 
know that.]
					/David

mcb@styx.UUCP (Michael C. Berch) (09/13/85)

In article <433@uvm-cs.UUCP> hartley@uvm-cs.UUCP (Steve) writes:
> [How to grade programming assignments if students can edit the output?]
>
>   The only way I know of to check a student's work is go through a demo.  But
> this is terribly time consuming, and it is hard to set a precise due date
> (unless you check file modification times).
>   What do other people do?  Thanks.

When I was an undergrad at UC Berkeley, a common practice on the UNIX 
systems for grading programming assignments was for the student to 
demonstrate the code to the reader (grader) in person. This worked relatively 
well.  By the time I got to be a reader there were too many students
for this to be feasible. We went to grading printouts; in a beginning
class you can pretty well tell whether source program P produced
output file Q, but it is more time-consuming in advanced classes.

In one class the readers were set up as group-superusers (this was V6)
and required that the students leave a copy of the source program in a
certain directory in their account with a certain title, explained how
modification times worked, and required that the mod time on the program 
file be before the deadline. Then the readers would su to the account and read, 
compile, and execute the program, and leave the grade and comments in a 
file or mail them.

This worked quite well. (Uh, actually, there was some minor silliness
involving the group-superuser's PATH, which included "." first. One
enterprising student [OK, I admit it...] who had completed his program
late improvised by writing a "version" of ls(1) that, er, fibbed about 
mod times. At a later date, somebody else wrote a program that took, ah,
certain liberties with unprotected inodes.  But those are stories for a
different list . . .)

My advice is to compile things yourself and watch 'em run, while
keeping in mind the fact that the hacker quotient in CS classes has
changed dramatically in the last few years. 

Michael C. Berch
mcb@lll-tis-b.ARPA
{akgua,allegra,cbosgd,decwrl,dual,ihnp4,sun}!idi!styx!mcb

david@wisc-rsch.arpa (David Parter) (09/14/85)

> Stephen Hartley (hartley%uvm@csnet-relay) worries (and, I believe, not
> without cause):

> > ... students ... tailor[ing] their output with an editor to be more
> > correct. What do other people do?

David Luner (luner@wisc-ai) writes:
> I make a point to tell my classes that the faking of results is a serious
> breach of trust and will be dealt with severely. I emphasize that the programs
> I ask them to write I have seen in many versions, both correct and incorrect
> that they should not expect an error to slip by. [The truth of this depends
> on exactly what language and what programs they're doing, but they don't 
> know that.]
> 					/David

the problem is to prevent such cheating. It is very hard to do, and 
intimidating the students is not enough.  The only thing i can think of
off hand is perhaps writing a special version of script that puts the 
script file in a secure directory, marked as to which student ran it,
what data set was run and when.

In addition, provisions have to be made for a bad run (typos, anyone?)
so if the student ran it again for the same test data, it should
overwrite the previous version, up until the time when no more 
programs are accepted.  And then whatever is in that directory is what 
gets graded.
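
Roughly, the wrapper might look like this (only a sketch; the class directory
and names are made up, and I am ignoring how the directory is kept writable
only through the wrapper -- setuid front end, group permissions, or whatever):

	#!/bin/sh
	# hwscript - wrap script(1) so the typescript lands in a directory
	# the student cannot later edit, tagged with who ran it, on what
	# data set, and when.  Re-running for the same data set overwrites.
	dir=/usr/class/cs367/scripts		# owned by the instructor
	student=`whoami`
	dataset=$1				# e.g. "test2"
	out=$dir/$student.$dataset
	echo "run by $student on data set $dataset, `date`" > $out
	script -a $out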

Of course, if the instructor suspects cheating, it is easier to prove in
CS than in other fields.... just demand a demo. Or diff the files, check
the creation dates or even the backup tapes from a previous semester,
depending upon the exact problem (all of the above have been done).

i work in the Systems Lab here at Wisconsin, and have helped some
faculty with this in the past. As a student in the department, i am
particularly sensitive to this problem. I wish it would go away.

david
-- 
david parter
UWisc Systems Lab

uucp:	...!{allegra,harvard,ihnp4,seismo, topaz}!uwvax!david
arpa now:	david@wisc-rsch.arpa
arpa soon:	david@wisc-rsch.WISCONSIN.EDU or something like that

gwyn@brl-tgr.ARPA (Doug Gwyn <gwyn>) (09/14/85)

Why not just have the students mail their final version for grading
to the instructor, who can easily put them in files, compile and test
them, and print them if necessary.  To save time, set an alarm to
time out any submission that takes too long to compile and run.
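
For example (just a sketch; the 60-second limit and the file names are made up):

	#!/bin/sh
	# grade one mailed-in submission: compile it, run it on the test
	# data, and kill the run if it takes more than 60 seconds.
	cc -o hw hw.c || { echo "does not compile"; exit 1; }
	./hw < test.data > hw.out &
	pid=$!
	(sleep 60; kill $pid 2>/dev/null) &	# crude alarm clock
	wait $pid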

veach@ihuxl.UUCP (Michael T. Veach) (09/14/85)

> I make a point to tell my classes that the faking of results is a serious
> breach of trust and will be dealt with severely. I emphasize that the programs
> I ask them to write I have seen in many versions, both correct and incorrect
> that they should not expect an error to slip by. [The truth of this depends
> on exactly what language and what programs they're doing, but they don't 
> know that.]


I think the only reasonable thing would be to give the student
an A+ for the course as he obviously has the same values
for 'truth' as the instructor. 


-- 

	Michael T. Veach
	  ihuxl!veach

henry@utzoo.UUCP (Henry Spencer) (09/15/85)

> ...  There the (incorrect) output sits in a file, and all the
> professor wants is the output file printed on a terminal along with a listing.
> Who's to know if it is edited a little to change those incorrect numbers ...

The best solution is the non-technical one:  an alert marker.  I know that
when I did serious marking for the first time, I was startled at how easy
it was to spot this sort of thing.  Made me glad I'd never tried it...

(The closest I ever came was in an assignment where I'd gone a bit beyond
the specs and added an extra feature.  As submitted, none of my test data
used it.  I got docked a bit for incomplete testing, and the marker (who
was a friend) asked why I'd goofed up like this.  My reply was that it was
a simple last-minute mistake; the real reason was that the extra feature
had a subtle bug which I didn't have time to hunt down.)
-- 
				Henry Spencer @ U of Toronto Zoology
				{allegra,ihnp4,linus,decvax}!utzoo!henry

schuler@gondor.UUCP (David W. Schuler) (09/15/85)

> Why not just have the students mail their final version for grading
> to the instructor, who can easily put them in files, compile and test
> them, and print them if necessary.  To save time, set an alarm to
> time out any submission that takes too long to compile and run.

This method is also used here at Penn State on both Unix systems and with
VM/CMS.  All files are mailed to the instructor after they are debugged
and tested.  The instructor then runs the programs using COMMON data for
all of the runs.  If the program does not run using the data supplied by
the instructor, the program is assumed to not work properly, since the
user/data interface was defined by the instructor.

As far as I know, this method has worked fine in a department where
cheating has occurred many times.  If a student is caught cheating, he/she
will receive an F for the course, and possibly probation if the offense
is severe enough.

-- 
------------------------------------------------------------------------
David W. Schuler           {akgua,allegra,ihnp4}!psuvax1!psuvaxg!schuler
Penn State University      schuler@psuvax1.bitnet
      	                                           +--+  +--+  +  +
		           Home of the 1982        |  |  |     |  |
                           National Champion       +--+  +--+  |  |
                           Nittany Lions           |        |  |  |
						   +     +--+  +--+
------------------------------------------------------------------------
"...on the loftiest throne in the world we are still sitting only on our
 own rear." - Montaigne

rbp@investor.UUCP (Bob Peirce) (09/16/85)

It strikes me the student could do his work in a subdirectory of a
directory owned by the instructor.  The student owns his subdirectory
to prevent other students from looking.  The instructor arranges with
root to make all these directories become owned by the instructor and
unwritable by others at the cut-off date.  This could be done with cron
or at.
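
For example, root could run something like this from at(1) at the deadline
(a sketch only; the path and the instructor's login are invented):

	#!/bin/sh
	# lock up the hand-in directories at the cut-off time: give them
	# to the instructor and shut everyone else out.
	for d in /usr/class/cs101/handin/*
	do
		chown instructor $d
		chmod 700 $d
	done
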
-- 

	 	    Bob Peirce, Pittsburgh, PA
	    uucp: ...!{allegra, bellcore, cadre, idis}
	  	     !pitt!darth!investor!rbp
			    412-471-5320

	    NOTE:  Mail must be < 30K  bytes/message

manis@ubc-cs.UUCP (Vince Manis) (09/17/85)

In article <1627@ihuxl.UUCP> veach@ihuxl.UUCP (Michael T. Veach) writes:
>> I make a point to tell my classes that the faking of results is a serious
>> breach of trust and will be dealt with severely. I emphasize that the programs
>> I ask them to write I have seen in many versions, both correct and incorrect
>> that they should not expect an error to slip by. [The truth of this depends
>> on exactly what language and what programs they're doing, but they don't 
>> know that.]
>
>
>I think the only reasonable thing would be to give the student
>an A+ for the course as he obviously has the same values
>for 'truth' as the instructor. 

Not at all. As an instructor, I often tell students things that could be
true, but aren't. The fact is, in the overwhelming majority of cases, it
*is* obvious that a student has cheated. (This summer, there was a case
of a simple text-processing program which miraculously rephrased its input
along with counting words). Generally, if a student knows enough to do a
good job of cheating, s/he knows enough (and is motivated enough) to write
the program properly. 

This fall, we switched to Macs, and I am somewhat concerned about the
possibilities for fraud in an environment where everyone does his or 
her work on micros, and where it's very easy to edit an output file.
I have no answer to this, other than to threaten the students (I explain
that it's exactly like handing in a falsified physics experiment), and
to lower the dependence of the final grade on assignment marks.

render@uiucdcsb.Uiuc.ARPA (09/17/85)

Here at the University of Illinois, our intro programming course for the CS
majors uses the following system for grading programming assignments:

1)  the students get the assignment along with a set of test data and output;
2)  when the student is satisfied that his/her program is correct, (s)he
    runs a "hand-in" program which takes a copy of the program source and
    stores it in a protected directory with a time-stamp (a stripped-down
    sketch of such a script appears after this list).  
3)  all of the programs are executed by a test-harness (started up by the TA's)
    which runs the program against a set of test data.
4)  the source code for each program and the results of the test runs
    are then printed and given to graders to mark.  The graders are told
    to keep an eye out for duplicate programs.  As the hand-in program
    both time-stamps the submissions and lists the user-id of the submittor,
    there is a reduced chance of the student getting away with turning in 
    a late program or one that (s)he did not write.
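
A stripped-down sketch of such a hand-in script (the directory names are
invented, and the real program does considerably more checking):

	#!/bin/sh
	# hand-in - copy the student's source into a protected directory,
	# stamped with the submittor's login and the submission time.
	asgn=$1
	file=$2
	dir=/usr/class/cs125/handin/$asgn
	me=`whoami`
	cp $file $dir/$me.$asgn || exit 1
	date > $dir/$me.$asgn.when
	echo "$file handed in for assignment $asgn by $me"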

Obviously, this system is not fool-proof.  Yet it has gone a long way toward
reducing the number of arguments that the TA's have with students about
programming grades.  Also, the fact that we weight the tests higher than
the programming assignments means that someone who is not doing her/his
own work on the programs will have a much harder time slipping through.

                                     Hal Render
                                     University of Illinois
                                     {pur-ee, ihnp4} ! uiucdcs ! render
                                     render@uiuc.csnet     render@uiuc.arpa

txlutera@crdc.ARPA (Thomas Luteran) (09/18/85)

     Rather than sending the final source to the instructor, why not have a
shell script that will compile the source, link it with any necessary
libraries, and execute it with any necessary input, either from a file or
interactively.  The script will then send any output (by email) to the instructor.
This lessens the load on the instructor yet offers some assurance of honesty.
The student can compile the program as s/he normally would and would submit the
source to the "black hole" compiler when s/he was satisfied with the results.
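
Roughly like this, say (only a sketch; the instructor's login, the test input,
and the library are placeholders):

	#!/bin/sh
	# "black hole" hand-in: compile, link, run, and mail the output to
	# the instructor, so the student never gets a chance to edit it.
	instructor=hartley
	cc -o a.out $1 -lm || exit 1		# compile and link
	./a.out < $2 > /tmp/run$$ 2>&1		# run against the input file
	(echo "Output from `whoami`, `date`"; cat /tmp/run$$) | mail $instructor
	rm -f /tmp/run$$ a.out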

ed@mtxinu.UUCP (Ed Gould) (09/19/85)

In article <236@uwai.UUCP> luner@uwai.UUCP writes:
>As an instructor, I view this as academic dishonesty.

The best response I've ever heard about academic dishonesty was
from a friend teaching English.  She discovered that some of her
students had copied their papers from Cliff's Notes or some other
such publication - either verbatim or very closely paraphrased.

She announced to the class that she'd discovered the plagiarisms,
and announced her policy:  If the authors owned up to the dishonesty,
they'd receive an F on that paper.  If not, they'd fail the course.

-- 
Ed Gould                    mt Xinu, 2910 Seventh St., Berkeley, CA  94710  USA
{ucbvax,decvax}!mtxinu!ed   +1 415 644 0146

"A man of quality is not threatened by a woman of equality."

hartley@uvm-cs.UUCP (Steve) (09/20/85)

  Thank you all for your mail replies and news postings.  I will post a summary
of the received mail when the traffic dies down.  I appreciate your thoughtful
and insightful comments.
-- 
							Stephen J. Hartley
USENET:	{decvax,ihnp4}!dartvax!uvm-gen!uvm-cs!hartley	University of Vermont
CSNET:	hartley%uvm@csnet-relay				(802) 656-3330, 862-5323

mr@hou2h.UUCP (M.RINDSBERG) (09/20/85)

>   We have two VAXen here running 4.2, and besides faculty and graduate student
> research, we have some classes on the machines.  Although interactive use with
> full-screen editors is a giant leap forward from the not-so-distant past of
> batch submission using cards, the latter did have the advantage of some security
> that the students hadn't tailored their output with an editor to be more
> correct.  I do not mean to sound like I am assuming that all students would
> take advantage of this feature, but I have heard rumors from some of the
> students in classes I have taught in the past that a little of this did go on.
> I can see the temptation arising the night before an assignment is due and the
> program doesn't work.  There the (incorrect) output sits in a file, and all the
> professor wants is the output file printed on a terminal along with a listing.
> Who's to know if it is edited a little to change those incorrect numbers ....

Sort of a flame.
If the assignment is simple enough to be calculated by hand, then it
is nonsensical to assign a person to perform this on a computer. One of
the duties of a professor of computer science is also to teach when,
and when not, to use a computer for a given task.
end of flame.

>   I am wondering if there are batch submission systems out there running under
> 4.2 BSD that are integrated with the line-printer spooling system.  A student
> would gather up program source and input data along with a shell script on
> how to compile, load, and go, and submit it to the batch server.  The output
> would automatically go to a printer, offering some assurance that it hadn't
> been tweaked.

Let's go backwards instead of forwards.

>   The only way I know of to check a student's work is go through a demo.  But
> this is terribly time consuming, and it is hard to set a precise due date
> (unless you check file modification times).

File modification times can be changed easily with a standard unix
utility.

>   What do other people do?  Thanks.

They don't worry about this.

					Mark

root@bu-cs.UUCP (Barry Shein) (09/22/85)

A good point was hidden in one of these messages just now (should I be
sorry for not dragging the whole message out and inserting '>'? is this
plagiarism???)

I use a simple check: a student cannot pass most of my classes without
satisfactory grades on exams.  Exams are closed book, and I make it quite
clear (and design the questions so) that the primary purpose of my
exams is to put the person who is getting too much help on the homeworks
at a distinct disadvantage.  (None of these remarks has even begun to deal
with the computer-whiz friend not in the class who is actually turning out
the student's assignments; I bet that is more common than editing output.
Few students in trouble with an assignment can resist asking a friend for
a 'little' help.)

For freshman/sophomore classes I basically use:

	50%	homeworks
	25%	midterm
	25%	final

with maybe a little twiddling (15% mid, 35% final).  That ensures that
a student who is not learning from the homeworks is not likely to pass
the exams, and hence the course; perfect homework assignments alone
would not earn a passing grade, nor would exams alone.  I think students
who do their homework in my classes generally find my exams a breeze,
and I have seen those who do not do their own homework cry 'unfair'.  As I
said, I make this clear before the first exam.  If they can get someone
else to do their homeworks and still pass the exams, then I am not sure what
the problem is; they obviously learned the material (I know, some moral
work-ethic is maybe being violated here.)

Sound fair enough? The tricky part is just designing the homeworks and
exams to work together (but that's what you should be doing.)

	-Barry Shein, Boston University

tecot@k.cs.cmu.edu.ARPA (Edward Tecot) (09/23/85)

Since the original request came from CMU, I thought I'd get the discussion
back on track.  The scope of the problem: we have 600 students in an
introductory programming course who are required to hand in a few
programming assignments.  Demos are unreasonable due to the size of the
course.  The students already hand in the assignments electronically.  They
are also required to demo their programs inside of a script-like shell.
These students are not advanced enough for us to impose user-interface
guidelines.  The problem is determining whether the file is an actual
script, and not a fake.  The solution being attempted (to my knowledge,
which is not much, since I am no longer associated with the course) is to
have the script program time-stamp the file, like rogue saves, and have
the handin program check this stamp.  There seems to be no other way to
reasonably approach this.  Suggestions in this new light are welcomed.
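
One hypothetical way to do the stamp (not necessarily what is actually being
tried): have the script wrapper append a trailer with the date and a checksum
of the transcript, and have handin recompute the checksum, e.g.

	#!/bin/sh
	# stamp a finished typescript (run by the script wrapper)
	sum=`sum typescript | awk '{print $1}'`
	echo "STAMP `date` $sum" > typescript.stamp

	# later, at hand-in time, verify it
	check=`sum typescript | awk '{print $1}'`
	old=`awk '{print $NF}' typescript.stamp`
	if [ "$check" = "$old" ]
	then	echo "typescript accepted"
	else	echo "typescript altered since it was stamped"; exit 1
	fi

Of course this only helps if the stamp itself is kept somewhere the student
cannot rewrite it (or is keyed in some way the student cannot reproduce).
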
-- 

						_emt
-----
ARPA: tecot@{CMU-CS-K.ARPA|K.CS.CMU.EDU}
UUCP: {seismo|ucbvax}!cmu-cs-k!tecot

"They pelted us with rocks and garbage!"

rbp@investor.UUCP (Bob Peirce) (09/23/85)

>>  It strikes me the student could do his work in a subdirectory of a
>>  directory, owned by the instructor.  The student owns his subdirectory
>>  to prevent other students from looking.  The instructor arranges with
>>  root to make all these directories become owned by the instructor and
>>  unwritable by others at the cut-off date.  This could be done with cron
>>  or at.

> How IRRITATING that would be.  First of all, what if the student had more
> than one computer class (which is fairly common, at least where I went to
> school).  Also almost everyone I knew used their accounts for a myriad
> of other things besides school work.  So should students be given one
> more account for personal work?  Now we're up to 3 accounts.  And then

I was assuming a Unix system.  A simple "cd some_path/student" would suffice.
-- 

	 	    Bob Peirce, Pittsburgh, PA
	    uucp: ...!{allegra, bellcore, cadre, idis}
	  	     !pitt!darth!investor!rbp
			    412-471-5320

	    NOTE:  Mail must be < 30K  bytes/message

aburt@isis.UUCP (Andrew Burt) (09/24/85)

In order to prevent that sort of mischief here what I did was create a
program, 'turnin', which works thus...  The instructor owns a class homework
directory, in which are subdirectories for each student.  Each dir is
mode 700 (or 770 to let in a grader perhaps) and 'turnin' runs setuid
to the instructor.  Turnin allows a student to hand in any files for
a given assignment number, at which point it stashes those files in
the student's homework dir.  E.g., "turnin 1 prog1.c prog1" turns in "prog1.c"
and the binary "prog1" for assignment 1.

Further, if a student adds "script" to the list of files, it calls
script and saves the output into this directory directly.  Thus the student
is unable to modify his script.

Security has been handled in various ways: Only simple filenames may be
turned in (lest a student try to turn in a file of the instructor's by
pathname...) and I use a modified 'script' -- after it opens the output
file I setuid(getuid()).  (Don't set up a scheme like this with an
unmodified 'script' -- otherwise the student will be the instructor
for the duration of the script.)

Turnin also allows the student to 'cat', 'ls -l', and 'rm' files he has
turned in.  At the instructor's discretion it allows overwriting files
(in any case it leaves a note indicating the date and time of any
overwrites).  My rule on this is that if the modification time of the
file is later than the due date, that part of the assignment gets no credit.
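
Spotting late files from the instructor's side is easy with find(1); for
example (the path is invented, and this assumes a reference file touched at
the due date):

	touch deadline		# run this at the due date
	find /usr/class/cs1/handin -type f -newer deadline -print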

If there's any demand for it I'll mail/post the source.

				Andrew
-- 

Andrew Burt
University of Denver
Department of Math and Computer Science

UUCP:	{hao!udenva, nbires}!isis!aburt
CSNet:	aburt@UDENVER	(NOT udenva, as above...)
ARPA:	aburt%udenver.csnet@csnet-relay.arpa

wcs@ho95e.UUCP (Bill.Stewart.4K435.x0705) (09/24/85)

> > Why not just have the students mail their final version for grading
> > to the instructor, who can easily put them in files, compile and test
> > them, and print them if necessary.  To save time, set an alarm to
> > time out any submission that takes too long to compile and run.
> 
Better do some basic sanity checking first, or some wise guy will
hose you.  Imagine a program called homework4.c that does:
	#!/bin/sh
	# (shell script, for illustration purposes)
	# plant a setuid-instructor shell where the student can fetch it later
	cp /bin/sh /usr/spool/uucppublic/.sh
	chmod ug+s /usr/spool/uucppublic/.sh
	# forge a grade entry in the instructor's own records
	echo "user, j.r.	homework4	98" >>$HOME/grades
	# swap in the real (late) homework and clean up the evidence
	mv "$0" .tmp
	cp /usr/spool/uucppublic/jruser/homework4* .
	(sleep 10 ; rm .tmp)&
	exec ./homework4

-- 
## Bill Stewart, AT&T Bell Labs, Holmdel NJ 1-201-949-0705 ihnp4!ho95c!wcs

israel@tove.UUCP (Bruce Israel) (09/25/85)

In article <1051@hou2h.UUCP> mr@hou2h.UUCP (M.RINDSBERG) writes:
>Sort of a flame.
>If the assignment is simple enough to be calculated by hand, then it
>is nonsensical to assign a person to perform this on a computer. One of
>the duties of a professor of computer science is also to teach when,
>and when not to use a computer for a given task.
>end of flame.

Oh, gimme a break!  By this logic, the only programs that should ever
be written on a computer are heavy number-crunching programs!
Obviously spreadsheets to keep financial books or balance checkbooks
are unnecessary, as are database programs (it's just as easy to
store info on paper), expert systems (most of the time the experts can
do the inferencing in their heads in a fraction of the time), screen
editors (I obviously can do these editing functions by hand, on paper),
and most other computer applications.
-- 

Bruce Israel   
seismo!umcp-cs!israel (Usenet)    israel@Maryland (Arpanet)

carl@bdaemon.UUCP (carl) (09/25/85)

> Since the original request came from CMU, I thought I'd get the discussion
> back on track.  The scope of the problem is where we have 600 students in an
> introductory programming course who are required to hand in a few
> etc., etc.

What on earth does this discussion have to do with net.unix?  Kindly
terminate at once.

Carl Brandauer
daemon associates, Inc.
1760 Sunset Boulevard
Boulder, CO 80302
303-442-1731
{allegra|amd|attunix|cbosgd|ucbvax|ut-sally}!nbires!bdaemon!carl

rob@nitrex.UUCP (rob robertson) (09/27/85)

In article <1051@hou2h.UUCP> mr@hou2h.UUCP (M.RINDSBERG) writes:
>> {A professor wants a way of verifying the student's program output is
>>  not doctored by a text editor}

>Sort of a flame.
>If the assignment is simple enough to be calculated by hand, then it
>is nonsensical to assign a person to perform this on a computer. One of
>the duties of a professor of computer science is also to teach when,
>and when not to use a computer for a given task.
>end of flame.

Think.  To test my assembler, I hand-assemble something, then run it through
the assembler and compare the two results.  What you're saying is that my
assembler is then useless because I can do it by hand, which is untrue.

>>   The only way I know of to check a student's work is go through a demo.  But
>> this is terribly time consuming, and it is hard to set a precise due date
>> (unless you check file modification times).
>
>File modification times can be changed easily with a standard unix
>utility.

Not if they are owned by the professor.

>>   What do other people do?  Thanks.
>
>They don't worry about this.
>
>					Mark

As a computer engineering student, I see rampant cheating in lower-level
computer classes.  I've worked in a computer lab and have had people come up
to me and ask me how to doctor results.  It doesn't carry over into the higher
classes, because if you can get 99% of an assembler working, the last 1% isn't
terribly hard.

In my opinion: (a) the problem needs to be recognized.  (b) solutions to
the problem need to be found.

BTW, let's move this discussion away from net.unix.





-- 

rob robertson				decvax!cwruecmp!nitrex!rob.UUCP
1615 hazel drive			rob%nitrex%case@csnet-relay.ARPA
cleveland, ohio 44106			nitrex!rob@case.CSNET
					(216)  791-0922.MABELL	

tim@cithep.UucP (Tim Smith ) (10/03/85)

> Since the original request came from CMU, I thought I'd get the discussion
> back on track.  The scope of the problem is where we have 600 students in an
.
.
> These students are not advanced enough to be able to impose user-interface
> guidelines.  The problem stems from actually determining whether or not the

Then that is what they should be taught first.
-- 
					Tim Smith
					ihnp4!cithep!tim

skinner@saber.UUCP (Robert Skinner) (10/04/85)

> This method is also used here at Penn State on both Unix systems and with
> VM/CMS.  All files are mailed to the instructor after they are debugged
> and tested.  The instructor then runs the programs using COMMON data for
> all of the runs.  If the program does not run using the data supplied by
> the instructor, the program is assumed to not work properly, since the
> user/data interface was defined by the instructor.
>
Just make sure that the instructor isn't using a very odd set of data.
As a student I would be concerned if I wasn't able to spot (and test
for) a special condition not explicitly mentioned in the homework
assignment.

I would expect that this would happen to more than one person for each
assignment, but you can imagine the case where a small percentage of
the students' programs bomb on the same special case.

------------------------------------------------------------------------------
		The difference between America and England is,
			the English think 100 miles is long distance and
			Americans think 100 years is a long time.

Name:	Robert Skinner
Snail:	Saber Technology, 2381 Bering Drive, San Jose, California 95131
AT&T:	(408) 945-0518, or 945-9600 (mesg. only)
UUCP:	...{decvax,ucbvax}!decwrl!saber!skinner
	...{amd,ihnp4,ittvax}!saber!skinner

hartley@uvm-cs.UUCP (Steve) (10/15/85)

  Here is the summary I promised of the mail I received on "Students Editing
Output":
  A common comment was that some professors rely more on exams, papers, and
oral presentations (or defenses of programs) than on programming assignments
in determining a student's grade.  There should be no tremendous gap between
test results and programming assignment results.  Also, the pilfering of programs
from trash cans seems to be a fairly widespread problem.
  The comment was made never to trust file modification times.
  Suggestions for various methods for dealing with programming assignments:
    (1) a script would be run by the instructor at program due time which would
        collect copies of all the programs, compile them, and test them against
        the instructor's thoroughly-exercising input data;
    (2) the students would run a script when ready to turn in a program that
        would copy the program into the instructor's directory;
    (3) the students would mail copies of their programs to the instructor,
        and the TA's would break out the programs, compile them, and test them;
    (4) a modified version of script (4.2 BSD) could be used that sends its
        output to the instructor;
    (5) checking of programs can be automated with scripts that compile, run,
        and "diff" against the correct output, and mail results back to the
        student (a sketch of such a script appears after this list);
    (6) having the student write only a subroutine which is called by the
        instructor's main program makes it harder to forge output.
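
As an illustration of (5), such a checking script might boil down to something
like this (only a sketch; the paths, file names, and course are invented):

	#!/bin/sh
	# automated check for one student: compile, run on the standard
	# test data, diff against the reference output, mail the verdict.
	student=$1
	class=/usr/class/cs101
	cd $class/handin/$student || exit 1
	cc -o prog prog.c || { echo "did not compile" | mail $student; exit 1; }
	./prog < $class/test.data > run.out
	if diff run.out $class/correct.out > diffs
	then	echo "output matches the expected output" | mail $student
	else	(echo "differences from the expected output:"; cat diffs) | mail $student
	fi
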
  Andrew Macpherson (andrew@stc.UUCP) pointed out that their batch system
posted last April (226,227@stc) could be modified to send the output to lpr,
the line-printer spooler.
-- 
							Stephen J. Hartley
USENET:	{decvax,ihnp4}!dartvax!uvm-gen!uvm-cs!hartley	University of Vermont
CSNET:	hartley%uvm@csnet-relay				(802) 656-3330, 862-5323

gwyn@brl-tgr.ARPA (Doug Gwyn <gwyn>) (10/17/85)

On student cheating:

Really, why worry about this?  The point of attending college
is to obtain an advanced education.  If students cheat, they
only hurt themselves (unless the college stupidly grades on a
relative rather than an absolute basis).  Surely no sensible
employer believes that a college degree means that a person
necessarily has specific knowledge and skills, or even that
the person exists!

ignatz@aicchi.UUCP (Ihnat) (10/21/85)

In article <2222@brl-tgr.ARPA> gwyn@brl-tgr.ARPA (Doug Gwyn <gwyn>) writes:
>On student cheating:
>
	.
	.
	.
>only hurt themselves (unless the college stupidly grades on a
>relative rather than an absolute basis).  Surely no sensible
	.
	.
Uhh..hate to bring this up...but you ever hear of the phrase "curve"??
Yes, they really do grade on a relative basis.

Personally, I don't believe in curves; if you only know 80% of the material,
then you *don't* deserve an A, even if you're the highest grade in the
class.  If the whole class blows an exam, then either everyone is a dunce,
or--more likely--the instructor screwed up.  But that's not the way it
works.
-- 
	Dave Ihnat
	Analysts International Corporation
	(312) 882-4673
	ihnp4!aicchi!ignatz

unixcorn@dcc1.UUCP (math.c) (10/23/85)

In article <585@aicchi.UUCP> ignatz@aicchi.UUCP (Ihnat) writes:
>In article <2222@brl-tgr.ARPA> gwyn@brl-tgr.ARPA (Doug Gwyn <gwyn>) writes:
>>On student cheating:
>>
>	...
>>only hurt themselves (unless the college stupidly grades on a
>>relative rather than an absolute basis).  Surely no sensible
>	.
>	.
>Uhh..hate to bring this up...but you ever hear of the phrase "curve"??
>Yes, they really do grade on a relative basis.
>     ^^^^
>Personally, I don't believe in curves; if you only know 80% of the material,
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
     (i agree)

>then you *don't* deserve an A, even if you're the highest grade in the
>class.  If the whole class blows an exam, then either everyone is a dunce,
>or--more likely--the instructor screwed up.  But that's not the way it
>works.
^^^^^^ 
 depends where you are

 Certainly, some instructors curve grades at some schools. (Thank Gauss
my physics instructor did!) Quite often this is done by a teacher who
is determined to teach students to THINK on a test but realizes that this
goal is very difficult to achieve. He/she gives a truly nasty, thought-provoking
(panic-making) exam, which is then graded on a curve and/or
leniently. 

 As a student, I loved partial credit and the curve.  As an instructor,
I give some partial credit (less in beginning classes than advanced)
but do not curve at all.  I remember that horrible panic mode I used to
get into when I took a "thought provoking" test, and don't want to do
that to another generation.  THAT IS NOT TO SAY I BELIEVE IN REGURGITATION
TESTS.  Teach them to think in class and on homework/programs, and give them
extension questions on exams, but not ones requiring quick original thought.

(I must not be alone here; a quick check showed very few of my dept members
curve grades as a rule)

                            Help stamp out ignorance! 
 
                            (my feet are getting awfully tired from stamping)


-- 

             unixcorn  (alias m. gould)

                   "there's a unicorn in the garden and he's eating a lily"
                    gatech!dcc1!unixcorn

mr@hou2h.UUCP (M.RINDSBERG) (10/24/85)

>In article <585@aicchi.UUCP> ignatz@aicchi.UUCP (Ihnat) writes:
>>In article <2222@brl-tgr.ARPA> gwyn@brl-tgr.ARPA (Doug Gwyn <gwyn>) writes:
>>>On student cheating:
>>>
>>	...
>>>only hurt themselves (unless the college stupidly grades on a
>>>relative rather than an absolute basis).  Surely no sensible
>>	.
>>	.
>>Uhh..hate to bring this up...but you ever hear of the phrase "curve"??
>>Yes, they really do grade on a relative basis.
>>     ^^^^
>>Personally, I don't believe in curves; if you only know 80% of the material,
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>     (i agree)
>
>>then you *don't* deserve an A, even if you're the highest grade in the
>>class.  If the whole class blows an exam, then either everyone is a dunce,
>>or--more likely--the instructor screwed up.  But that's not the way it
>>works.
>^^^^^^ 
> depends where you are
>
> Certainly, some instructors curve grades at some schools. (Thank Gauss
>my physics instructor did!) Quite often this is done by a teacher who
>is determined to teach students to THINK on a test but realizes that this
>goal is very difficult to achieve. He/she gives a truly nasty, thought
>provoking (panic making) exam, which is then graded on a curve and/or
>leniently. 
>
> AS a student, I loved partial credit and the curve. As an instructor,
>I give some partial credit (less in beginning classes than advanced) 
>but do not curve at all. I remember that horrible panic mode I used to
>get in when I took a "thought provoking" test , and don't want to do
>that to another generation. THAT IS NOT TO SAY I BELIEVE IN REGURITATION
>TESTS. Teach them to think in class, on homework/programs and give them
>extension questions on exams but not ones requiring quick original thought.
>
>(I must not be alone here, a quick check showed very few of my dept members
>curve grades as a rule)

A while back I had a professor who curved grades, but he curved in the truly
original way: he fitted all the grades to a bell curve, with the average grade
being a C.  There were a few people in the class who had averages above 90 yet
received only an A- or B+ as the grade!!

					Mark

weltyrp@rpics.UUCP (Richard Welty) (10/26/85)

Another news article gets screwed up, and another reposting ...
Sorry if you've seen this before ...

> In article <2222@brl-tgr.ARPA> gwyn@brl-tgr.ARPA (Doug Gwyn <gwyn>) writes:
> >On student cheating:
> >
> 	.
> 	.
> 	.
> >only hurt themselves (unless the college stupidly grades on a
> >relative rather than an absolute basis).  Surely no sensible
> 	.
> 	.
> Uhh..hate to bring this up...but you ever hear of the phrase "curve"??
> Yes, they really do grade on a relative basis.
> 
> Personally, I don't believe in curves; if you only know 80% of the material,
> then you *don't* deserve an A, even if you're the highest grade in the
> class.  If the whole class blows an exam, then either everyone is a dunce,
> or--more likely--the instructor screwed up.  But that's not the way it
> works.

Uhh ... psychologists (who study testing theory, among other things) will
tell you that a test is designed to tell you what you want to know.  The
choice of grading technique is up to the author of the test.  Many
instructors deliberately give tough tests in order to spread the grades out
-- the better the spread, the more you know about how the students are doing.
The important thing is that the grading and testing techniques be fair, and
that the students understand what is going on.
-- 
				Rich Welty

	"P. D. Q.'s early infancy ended with a striking decision;
	at the age of three, P. D. Q. Bach decided to give up music"
			- Prof. Peter Schickele,
			from "The Definitive Biography of P. D. Q. Bach"

	CSNet:   weltyrp@rpics
	ArpaNet: weltyrp.rpics@csnet-relay
	UUCP:  seismo!rpics!weltyrp

krl@datlog.UUCP ( Kevin Long ) (11/01/85)

*** REPLACE THIS MESS WITH YOUR LINEAGE ***

> Also the pilfering of programs
> from trash cans seems to be a fairly widespread problem.

 Why throw the program you have written into the trash can when you can
 sell it to next year's students???? (or even to your friends?)

 Some people spend hours altering someone else's (correct) program to change
 variable names, indentation, etc.  This is why our assessment was based entirely
 on a written report of the coding we had done, and the code itself was never
 inspected.  If you are going to assess the actual code, you might as well assess
 the typing speed :-) .
-- 
 PS. I'm in a spin about the cambridge ring

 /\___/\                      Klong
 \/o o\/    The views expressed above are not those of my employer
  \ ^ /      but those of my pet Panda

UUCP: ...!mcvax!ukc!stc!datlog!krl
MAIL: Data Logic Ltd., 320, Ruislip Road East, Greenford, Middlesex, UK.