[comp.sys.mac.programmer] Need help programming SCC

marykuca@sol.uvic.ca (Brent Marykuca) (01/30/91)

Hello there,

We are trying to set up a fairly high-resolution timing device based
on the Time Manager's ability to run a task every millisecond.  Our
task simply increments a global counter and it seems to give reasonable
accuracy.  Our difficulty lies in trying to find an input device that
we can use which will give us nearly-immediate response.  We have tried
using the keyboard, but the ADB manager only seems to report keypresses
every 11ms or so, which is apparently not good enough.  Our plan now is
to wire up some sort of device to the serial port which will cause an
interrupt when one of two buttons is pressed, and then have our interrupt
handler read the clock for us.
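
A minimal sketch of the kind of counter task described above, using the
original Time Manager calls from Timer.h (the routine names here are made
up for illustration, not Brent's actual code):

#include <Timer.h>

static TMTask  gMsTask;     /* Time Manager queue element */
static long    gMsCount;    /* bumped once per "millisecond" */

/* Task routine: increment the counter and re-prime for another 1 ms.
   Keeping everything in globals sidesteps the A1-register convention
   the 68K Time Manager uses to pass the TMTask address. */
static pascal void MsTaskProc(void)
{
    gMsCount++;
    PrimeTime((QElemPtr)&gMsTask, 1);
}

void StartMsCounter(void)
{
    gMsCount = 0;
    gMsTask.tmAddr  = (ProcPtr)MsTaskProc;
    gMsTask.tmCount = 0;
    InsTime((QElemPtr)&gMsTask);
    PrimeTime((QElemPtr)&gMsTask, 1);   /* first tick in 1 ms */
}

void StopMsCounter(void)
{
    RmvTime((QElemPtr)&gMsTask);
}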

So, what I'm looking for is somebody who has done some low-level serial
port interrupt handler stuff with the SCC, or any tips at all about how
to approach this project.  Can anybody help?

Cheers,

Brent Marykuca (marykuca@sol.uvic.ca)
Apple Research Partnership Program
University of Victoria, BC, Canada

jchoi@ics.uci.edu (John Choi) (02/01/91)

In article <1991Jan29.205455.26919@sol.UVic.CA> marykuca@sol.uvic.ca (Brent Marykuca) writes:
>
>We are trying to set up a fairly high-resolution timing device based
>on the Time Manager's ability to run a task every millisecond.  Our
>task simply increments a global counter and it seems to give reasonable
>accuracy.  Our difficulty lies in trying to find an input device that
....
>to wire up some sort of device to the serial port which will cause an
>interrupt when one of two buttons is pressed, and then have our interrupt
>handler read the clock for us.
>
>So, what I'm looking for is somebody who has done some low-level serial
>port interrupt handler stuff with the SCC, or any tips at all about how
>to approach this project.  Can anybody help?
>

   I have a similar set-up now using just the normal serial buffer functions
in the Serial Driver.  I have a square pulse generated by a TTL chip going
directly to pin 5 of the modem port, with pin 4 as ground.

The program sets up the modem port for regular serial communication at
some arbitrary baud rate.  When the data comes into the port it is stored
in a data buffer.  I clear the data buffer after each 'character'
transmission and store the time away.  My problem is that I can't seem to
get the Time Manager to work correctly.  When I set up a loop to increment
a global variable every 5 or 10 counts, each incremented value seems to
correspond to 1.05 msec over a range of minutes.  Do you just recalibrate
to match real time, or is there a real solution?  I'm using THINK C 4.02
on a Mac II.
    If you don't have this problem, could you send me some of your timing
routines?  Thanks.
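
For the recalibration route, one possible scheme (a sketch with made-up
names, assuming the drift is roughly linear) is to measure the counter
against TickCount over a long stretch and scale readings by the result:

extern long gMsCount;               /* bumped by the 1 ms Time Manager task */

static double gMsPerCount = 1.0;    /* correction factor */

void CalibrateMsCounter(long calibrationTicks)
{
    long  startTicks = TickCount();
    long  startCount = gMsCount;
    long  elapsedTicks, elapsedCount;

    while (TickCount() - startTicks < calibrationTicks)
        ;                           /* busy-wait through the interval */

    elapsedTicks = TickCount() - startTicks;
    elapsedCount = gMsCount - startCount;

    /* true elapsed milliseconds = ticks * 1000 / 60.15 */
    gMsPerCount = (elapsedTicks * (1000.0 / 60.15)) / elapsedCount;
}

double MillisecondsNow(void)
{
    return gMsCount * gMsPerCount;
}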

Here is the Serial Port stuff - not really complete.

void timeSignals()
{
  long     len;
  short    inPort, config;
  char     inBuf[10];
  int      onLine = 1;
  long     timeNow;                             /* tick count at most recent signal */

  OSErr    anErr;
  EventRecord   anEvent;

  anErr = RAMSDOpen(sPortA);                    /* load the RAM serial driver */
  inPort = -6;                                  /* .AIn (modem port input) = -6 */
  config = baud9600+data8+stop10+noParity;
  anErr = SerSetBuf(inPort, (Ptr)inBuf, 10);    /* use our own input buffer */
  anErr = SerReset(inPort, config);

  while(onLine) {
    anErr = SerGetBuf(inPort, &len);            /* how many bytes are waiting? */
    if(len){
       anErr = FSRead(inPort, &len, (Ptr)inBuf);
       timeNow = TickCount();                   /* time-stamp the signal */
       if(len > 1) DoErrText("Transfer Buffer Overflow");   /* my own error routine */
    }

    if(GetNextEvent(mDownMask,&anEvent)) onLine = 0;        /* quit on mouse click */
  }

  RAMSDClose(sPortA);
  return;
}

aeh@mentor.cc.purdue.edu (Dale Talcott) (02/04/91)

In article <27A8D317.5428@ics.uci.edu> jchoi@ics.uci.edu (John Choi) writes:
jc>In article <...> marykuca@sol.uvic.ca (Brent Marykuca) writes:
bm>>We are trying to set up a fairly high-resolution timing device based
bm>>on the Time Manager's ability to run a task every millisecond.
...
jc>  When I set up a loop to increment a global variable
jc>every 5 or 10 counts, each incremented value seems to correspond to 1.05 msec
jc>over a range of minutes.  ...  I'm using THINK C 4.02 on a Mac II.

Only off by 5%.  You're lucky!  On a Mac Plus, a Time Manager PrimeTime()/
timer-task call pair takes about 0.2 to 0.3 ms, so trying to provide the
"millisecond accuracy" that psychology types are so enamored of was loads
of fun.  I ended up using TickCount as an accurate counter of 1/60.15ths
of a second and then used Time Manager "milliseconds" to subdivide those
ticks.  The accuracy was thus +0 -5 ms, irrespective of total duration.
On a II, you should be able to get +0 -2 ms.
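
Something along these lines (a sketch with made-up names, not Dale's actual
code): a 1 ms Time Manager task keeps a sub-counter but resynchronizes it
whenever TickCount advances, so Time Manager drift can never accumulate
past one tick:

static TMTask gSubTask;        /* 1 ms Time Manager task       */
static long   gBaseTicks;      /* TickCount at the last resync */
static long   gMsInTick;       /* "milliseconds" since resync  */

static pascal void SubTaskProc(void)
{
    /* TickCount just reads the Ticks low-memory global and does not
       move memory, so it can be called here at interrupt time. */
    long nowTicks = TickCount();

    if (nowTicks != gBaseTicks) {   /* crossed a tick boundary:  */
        gBaseTicks = nowTicks;      /* resync so Time Manager    */
        gMsInTick  = 0;             /* drift can't accumulate    */
    } else
        gMsInTick++;

    PrimeTime((QElemPtr)&gSubTask, 1);      /* run again in 1 ms */
}

void StartSubTimer(void)
{
    gBaseTicks = TickCount();
    gMsInTick  = 0;
    gSubTask.tmAddr = (ProcPtr)SubTaskProc;
    InsTime((QElemPtr)&gSubTask);
    PrimeTime((QElemPtr)&gSubTask, 1);
}

/* Elapsed time since some reference tick count: whole ticks are exact
   (1000/60.15 ms each); the fraction within the current tick comes
   from the Time Manager sub-counter. */
long ElapsedMs(long refTicks)
{
    return (long)((gBaseTicks - refTicks) * (1000.0 / 60.15)) + gMsInTick;
}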

An article by Lawrence D'Oliveiro says:
ld> According to my interim Inside Mac vol VI dated October 1990, the
ld> tmCount field is "set by RmvTime".

Wish I'd had IM VI!  Looks like the code just got lots simpler.  Not being
a trusting soul, though, I'll verify that several run-every-ms tasks don't
louse up the reported time for a long-duration task.  Does anyone know if
the VIA timer keeps running after it interrupts at the end of a Time
Manager interval, or is there "dead" time between the end of one interval
and when the Time Manager starts timing the next?  I suspect the latter,
since one version of my code used two 2 ms tasks running 1 ms out of phase
with each other, and it didn't help.
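
For what it's worth, the simpler code that the IM VI note allows would look
roughly like this (a sketch with made-up names; under the revised Time
Manager a negative tmCount means the remainder is in microseconds):

#include <Timer.h>

#define TIMER_SPAN  600000L     /* prime for ten minutes' worth of ms */

static TMTask gStopwatch;

void StartStopwatch(void)
{
    gStopwatch.tmAddr  = (ProcPtr)0L;   /* NIL: no task routine needed; */
    gStopwatch.tmCount = 0;             /* we only want the unexpired   */
    InsTime((QElemPtr)&gStopwatch);     /* time back from RmvTime       */
    PrimeTime((QElemPtr)&gStopwatch, TIMER_SPAN);
}

/* Stop timing and return elapsed milliseconds.  Per the IM VI note
   quoted above, RmvTime leaves the unexpired time in tmCount, so
   elapsed = span - remaining. */
long ReadStopwatch(void)
{
    long remainingMs;

    RmvTime((QElemPtr)&gStopwatch);
    remainingMs = (gStopwatch.tmCount >= 0)
                    ? gStopwatch.tmCount
                    : -gStopwatch.tmCount / 1000;   /* negative => microseconds */
    return TIMER_SPAN - remainingMs;
}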

Dale Talcott, Purdue University Computing Center
aeh@j.cc.purdue.edu, {purdue,pur-ee}!j.cc.purdue.edu!aeh, aeh@purccvm.bitnet