[comp.sys.mac.programmer] LSC 3.0 "Context" + Passing parameters

west@bnrmtv.UUCP (andrew west) (07/23/88)

-----------------------------------------------------------------

I have a couple of questions regarding Lightspeed C (or is it now
THINK C? -- I "think" I like "Lightspeed" better).

1) I finally received my upgrade to 3.0 and have been playing around
   with the source level debugger (what else? :-) ).  I am very
   impressed with it but am not entirely clear on the subject of
   "context" as it relates to variables displayed in the data window.

   When you want to display a variable in the data window, you must
   click on a line in your source which contains that variable and
   then type the variable name (does cut/paste work?) in the data
   window.  According to the LSC 3.0 manual, this gives the debugger a
   "context" in which to evaluate the variable.

   Does this mean that the variable in the data window will be
   updated *only* when that line of code is executed, or whenever
   that section of code (e.g. that function) is executed?  If the
   variable is global, will it be updated after every line of code,
   or only at that single line where you provided the context?  The
   LSC manual mentions placing multiple references to a variable in
   the data window--can each reference have a different value,
   depending on where you provided the context vs. the line of code
   currently being executed?  And what happens when you clobber a
   variable without making reference to it, for example by using a
   pointer with a wrong value and writing to the wrong section of
   memory, as in the fragment below?
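
   A contrived fragment, just to show the clobbering case I mean
   (the address is made up):

      int victim;        /* the variable I'm watching in the data window */

      stomp()
          {
          int *p;

          p = (int *) 0x00001234;   /* pointer with a wrong value...     */
          *p = 99;                  /* ...this store could land right on */
          }                         /* 'victim' without ever naming it   */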


2) This may not be a LSC question but I encountered the problem when
   I was using LSC so what the heck...

   Some sample code:

  ------------------------------------------------------------------

   #include "stdio.h"
   #define MY_FIRST_VARIABLE 16
   #define ANOTHER_ONE       0
   #define YET_ANOTHER_ONE   256

   #define int32  long int   /* Change to correct type to create 32, 16 */
   #define int16  int        /*   and 8 bit variables on your machine   */
   #define int8   char


   int16 data[10] = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 };


int  my_function (my_first_var, another_one, my_data, yet_another)

     int32   my_first_var;
     int32   another_one;
     int16   *my_data;
     int32   yet_another;

     {
     printf("yet_another = %d\n", yet_another);
	/* Or you can use the fancy new source level debugger */


     }  /* end of "my_function" */


 main()

     {

     my_function (MY_FIRST_VARIABLE, ANOTHER_ONE,
		      data, YET_ANOTHER_ONE);

     }


 --------------------------------------------------------------
  (Please ignore any minor syntax errors.  I just typed this
   in from scratch and syntax isn't the problem anyway).


  When I tried running this, the values the variables contained inside
  the function were not the same as the values I (thought I) passed
  in from outside the function.  Passing in a 16 would yield an 11
  inside the function, for example.

  After I played around for a while, I finally discovered that the
  problem was related to the values I was passing into the
  function--in order to have my "defined" variables interpreted
  correctly, I had to cast each one of them to the same type as the
  types inside the function.  This would certainly make sense if the
  variables outside the function were a different type, but it seemed
  strange at first since I was using "define"s, not "real" variables.
  I guess the compiler had to put *something* on the stack.

   my_function ( (int32) MY_FIRST_VARIABLE, (int32) ANOTHER_ONE,
		 (int16 *) data, (int32) YET_ANOTHER_ONE);


  After all that, my question is:  Does LSC (or do other C compilers
  in general) default to a particular type (int, for example) in a
  case like this where the variables passed into a function are
  "declared" using define statements?


					Thanks,

					Andrew West
	       { amdahl, ames, hplabs } ...!bnrmtv!west

singer@endor.harvard.edu (Rich Siegel) (07/24/88)

In article <3653@bnrmtv.UUCP> west@bnrmtv.UUCP (andrew west) writes:
>   When you want to display a variable in the data window, you must
>   click on a line in your source which contains that variable and
>   then type the variable name (does cut/paste work?) in the data
>   window.  According to the LSC 3.0 manual, this gives the debugger a
>   "context" in which to evaluate the variable.

	Consult page 140 of the manual a bit more closely:

	"The debugger compiles the expressions you enter in the context
of the selected line in the Source window, or, if you haven't selected
a line, within the context of the current statement."

And on evaluation of expressions (page 142):

	"The debugger re-evaluates the expressions in the data window every
time your program stops. Expressions whose context isn't in the current
function are not re-evaluated unless they have global scope" [that is,
the expression refers to "static" or global storage --RS].
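
	To make that concrete, a made-up sketch (the names are mine,
not the manual's):

	long total;         /* global: "total" typed into the data window   */
	                    /* is re-evaluated every time the program stops */

	void tally(void)
	{
	    int count;      /* local: "count" entered with a line of        */
	    count = 1;      /* tally() selected is re-evaluated only while  */
	    total += count; /* the program is stopped inside tally()        */
	}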

		--Rich

Rich Siegel
Quality Assurance Technician
THINK Technologies Division, Symantec Corp.
Internet: singer@endor.harvard.edu
UUCP: ..harvard!endor!singer
Phone: (617) 275-4800 x305

awd@dbase.UUCP (Alastair Dallas) (07/24/88)

That's an easy one.  Your problem boils down to this simple failing
test case:

#define N 15

int func(arg)
long arg;
{

	printf("%ld", arg);

}

main()
{

	func(N);

}

The preprocessor does straight text substitution, so you get:

	func(15);

It should be obvious that 15 will be treated as an int (16-bit in LSC).
Your function expects a long.  Mystery solved.
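
Roughly, the caller pushes a 16-bit int while func reads 32 bits off the
stack, so the value it sees is your 15 combined with whatever bytes sit
next to it.  The cure is to make the constant's type explicit, or to let
a prototype do the converting -- a sketch (either fix alone is enough):

#include <stdio.h>

#define N 15L                 /* 'L' suffix: the constant is a long now */

int func(long arg)            /* prototype-style definition: even a     */
{                             /* plain 15 would get converted to long   */
	printf("%ld\n", arg);
	return 0;
}

main()
{
	func(N);              /* with a plain "#define N 15", you would  */
	return 0;             /* cast instead: func((long) N);           */
}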

Next!

/alastair/

shane@chianti.cc.umich.edu (Shane Looker) (07/26/88)

In article <3653@bnrmtv.UUCP> west@bnrmtv.UUCP (andrew west) writes:
!-----------------------------------------------------------------
!
!2) This may not be a LSC question but I encountered the problem when
!   I was using LSC so what the heck...
!
!   Some sample code:
!
!  ------------------------------------------------------------------
!
!   #include "stdio.h"
!   #define MY_FIRST_VARIABLE 16
!   #define ANOTHER_ONE       0
!   #define YET_ANOTHER_ONE   256
!
!   #define int32  long int   /* Change to correct type to create 32, 16 */
!   #define int16  int        /*   and 8 bit variables on your machine   */
!   #define int8   char
!
!
!   int16 data[10] = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 };
!
!
!int  my_function (my_first_var, another_one, my_data, yet_another)
!
!     int32   my_first_var;
!     int32   another_one;
!     int16   *my_data;
!     int32   yet_another;
!
!     {
!     printf("yet_another = %d\n", yet_another);
!	/* Or you can use the fancy new source level debugger */
!
!
!     }  /* end of "my_function" */
!
!
! main()
!
!     {
!
!     my_function (MY_FIRST_VARIABLE, ANOTHER_ONE,
!		      data, YET_ANOTHER_ONE);
                      ^^^^
      Danger, Will Robinson, Danger, Danger.  Be careful with something like
this.  LSC will pass whole structures on the stack, so watch what goes into
an argument list.  (An array name like this one decays to a pointer, so what
actually gets passed here is an address--just make sure the function expects
exactly that.)

!
!     }
!
!
! --------------------------------------------------------------
!
!  When I tried running this, the values the variables contained inside
!  the function were not the same as the values I (thought I) passed
!  in from outside the function.  Passing in a 16 would yield an 11
!  inside the function, for example.
!
!  After I played around for a while, I finally discovered that the
!  problem was related to the values I was passing into the
!  function--in order to have my "defined" variables interpreted
!  correctly, I had to cast each one of them to the same type as the
!  types inside the function.  This would certainly make sense if the
!  variables outside the function were a different type, but it seemed
!  strange at first since I was using "define"s, not "real" variables.
!  I guess the compiler had to put *something* on the stack.
!
!   my_function ( (int32) MY_FIRST_VARIABLE, (int32) ANOTHER_ONE,
!		 (int16 *) data, (int32) YET_ANOTHER_ONE);
!
!
!  After all that, my question is:  Does LSC (or do other C compilers
!  in general) default to a particular type (int, for example) in a
!  case like this where the variables passed into a function are
!  "declared" using define statements?
!
!					Andrew West

Your problems may be related to how the array gets passed (I'm not sure),
but more likely they come from the fact that a compiler will usually assume
you are passing an (int) if the constant fits in one.  The size of (int)
varies from machine to machine and compiler to compiler.

I would strongly advise using prototypes.  They are very handy: not only
can they save you from problems like this, but they let the compiler know
what you want to pass instead of second-guessing you.
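
A sketch, reusing the names from the quoted post above (the point is just
where the prototype goes):

	/* A prototype, visible before any call: */
	int my_function(long my_first_var, long another_one,
	                int *my_data, long yet_another);

	main()
	{
	    /* With the prototype in scope the compiler quietly      */
	    /* converts 16, 0, and 256 to long -- no casts needed:   */
	    my_function(MY_FIRST_VARIABLE, ANOTHER_ONE,
	                data, YET_ANOTHER_ONE);
	}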

Shane Looker
Looker@um.cc.umich.edu