[comp.sys.amiga.programmer] > SAS gripes

tll@nntp-server.caltech.edu (Tal Lewis Lancaster) (04/06/91)

cpca@marlin.jcu.edu.au (Colin Adams) writes:

 [cut...snip]


>>I wouldn't say gcc is bug free but I have
>>found it more stable for my project than SAS or Aztec.   And I have been
>>producing 1.4 M executables with it!  So I wouldn't say gcc can't handle large
>>projects.

>yes, 1.4M executable is bigger than my small 250k exec. but I'm working
>to get it smaller not bigger :-)

Yes, I hope to get that 1.4M down a little bit too.  Maybe closer to 600K.

>>  Because I am doing things that just can't be done with SAS or Aztec.

>Why not?

Well, the main reason is that I am creating object files greater than 32K
(some are actually around 80K).  SAS and Aztec cannot handle object files
> 32K.  Or, to be more precise, a call to another function in the same
object file must be < 32K away.

Gee, why don't I just make object files < 32K (in other words, write
smaller C files)?  Well, I am not the one producing the C files; my
computer is.  I have been working on a compiler that uses C as its
intermediate language, and it is the output from this compiler that
produces these large C files.  Making the compiler produce smaller C
files would mean a lot of changes to it.  I will need to make those
changes sometime...

>>SAS does have one of the better debuggers I have seen.  But its other tools
>>are geared more for small projects.  For example its make is really stupid
>>and forces duplication.

>The SAS debugger is pretty good. I have found the make utility to be
>ok, once you set it up it works fine. 

For example, if you have the following dependency

fish.o:  spam.h


LMK will try to compile spam.c.  So you are forced to use

fish.o: fish.c spam.h

Also, if you have the macro

OBJ=  [ lots of files total length > 255]

and try 

fish: $(OBJ)
	blink FROM $(OBJ) ...

This will lock up your computer and maybe even trash your hard-drive.
So you are forced to duplicate the contents of OBJ into a blink file.

>Still SAS is good enough for me.

Yes, I have to agree SAS is pretty good.  I am just pointing out problems
with it that prevent me from considering it a professional product.

>-- 
>Colin Adams                                  
>Computer Science Department                     James Cook University 
>Internet : cpca@marlin.jcu.edu.au               North Queensland
>'And on the eight day, God created Manchester'

Tal Lancaster
tll@tybalt.caltech.edu

jesup@cbmvax.commodore.com (Randell Jesup) (04/06/91)

In article <1991Apr5.173845.4404@nntp-server.caltech.edu> tll@nntp-server.caltech.edu (Tal Lewis Lancaster) writes:
>cpca@marlin.jcu.edu.au (Colin Adams) writes:
>>  Because I am doing things that just can't be done with SAS or Aztec.
>
>>Why not?
>
>Well the main reason is I am creating object files greater than 32K (actually
>some are around 80K).  SAS and Aztec can not handle object files > 32K.  Or to
>be more precise a function call to another function in the same object file must
>be < 32K apart.

	Easy, just use -r0 to turn off pc-relative bsr's (-b0 to use long
data addressing).

>fish.o:  spam.h
>
>LMK will try to compile spam.c.  So you are forced to use
>
>fish.o: fish.c spam.h

	Very annoying, true.  So use some other make (originally the make
was some separate lattice thing).

-- 
Randell Jesup, Keeper of AmigaDos, Commodore Engineering.
{uunet|rutgers}!cbmvax!jesup, jesup@cbmvax.commodore.com  BIX: rjesup  
Disclaimer: Nothing I say is anything other than my personal opinion.
Thus spake the Master Ninjei: "To program a million-line operating system
is easy, to change a man's temperament is more difficult."
(From "The Zen of Programming")  ;-)

ben@epmooch.UUCP (Rev. Ben A. Mesander) (04/06/91)

>In article <18f1e3ae.ARN128f@jsmami.UUCP> jsmoller@jsmami.UUCP (Jesper Steen Moller) writes:
[...]

>Try disassembling this (no go with OMD, but Matt's dobj does this
>just fine :) ). It's ok for the first 32k, but from there on,
>there's garbage instead (as the "text-section exceeds 32k, branches
>might be corrupt" warning, or whatever it says, tells the user).
>The branches are wrong and there's garbage after them...

I had this problem when I ported GNU Chess with SAS/C 5.10. The 5.10a
release caused the error messages to go away, and it all seems cool
now. Are you using SAS/C 5.10a?

>Hope you don't rely on SAS/C too much...

Well, when it spams out on me, I shut down any extra programs I have
running and fire up GCC :-) A good 90% of the time, SAS/C does the job.

>Greets, Jesper
>--                     __
>Jesper Steen Moller   ///  VOICE: +45 31 62 46 45
>Maglemosevej 52  __  ///  USENET: cbmehq!cbmdeo!jsmoller
>DK-2920 Charl    \\\///  FIDONET: 2:231/84.45
>Denmark           \XX/

--
| ben@epmooch.UUCP   (Ben Mesander)       | "Cash is more important than |
| ben%servalan.UUCP@uokmax.ecn.uoknor.edu |  your mother." - Al Shugart, |
| !chinet!uokmax!servalan!epmooch!ben     |  CEO, Seagate Technologies   |

jsmoller@jsmami.UUCP (Jesper Steen Moller) (04/07/91)

In article <20421@cbmvax.commodore.com>, Randell Jesup writes:

> In article <1991Apr5.173845.4404@nntp-server.caltech.edu> tll@nntp-server.caltech.edu (Tal Lewis Lancaster) writes:
> >cpca@marlin.jcu.edu.au (Colin Adams) writes:
> >>  Because I am doing things that just can't be done with SAS or Aztec.
> >
> >>Why not?
> >
> >Well the main reason is I am creating object files greater than 32K (actually
> >some are around 80K).  SAS and Aztec can not handle object files > 32K.  Or to
> >be more precise a function call to another function in the same object file must
> >be < 32K apart.
> 
> 	Easy, just use -r0 to turn off pc-relative bsr's (-b0 to use long
> data addressing).

Not easy, unfortunately. The branches aren't correct (in fact, the compiler
will produce some utter crap if the branches exceed 32k).
-b0 will turn JSR xx(pc) into JSR xx, but only for outside functions...
To optimize, the compiler changes all JSRs to BSR.(B|W), something
it's done ever since version 3.03. These BSRs are not affected by -r0
or -r1... A bug to me...

To prove that, I made a file of:

void f1() {f1();}
void f2() {f1();}
void f3() {f1();}
void f4() {f1();}
void f5() {f1();}
/* etc... 8000 functions all in all */

And compiled with -r0 (with lc1b, lc1 simply can't do it...)
And -v for lc2 of course...

Try disassembling this (no go with OMD, but Matt's dobj does this
just fine :) ). It's ok for the first 32k, but from there on,
there's garbage instead (as the "text-section exceeds 32k, branches
might be corrupt" warning, or whatever it says, tells the user).
The branches are wrong and there's garbage after them...

Hope you don't rely on SAS/C too much...

> Randell Jesup, Keeper of AmigaDos, Commodore Engineering.

Greets, Jesper
--                     __
Jesper Steen Moller   ///  VOICE: +45 31 62 46 45
Maglemosevej 52  __  ///  USENET: cbmehq!cbmdeo!jsmoller
DK-2920 Charl    \\\///  FIDONET: 2:231/84.45
Denmark           \XX/

ben@epmooch.UUCP (Rev. Ben A. Mesander) (04/07/91)

>In article <18f44cf6.ARN12b4@jsmami.UUCP> jsmoller@jsmami.UUCP (Jesper Steen Moller) writes:
[SAS/C 5.10 problems with handling >32k funcs]

>Yes. The compiler warns me that there can be a problem, and there is.
>Are you using lc1b instead of lc1, then? -r0?

These are the relevant portions of my lmkfile:
CC	=lc
CFLAGS	= -cuw -b0

>It does the job 100% of the time for me at the moment (I can't use DICE
>for load-libraries...), and it is indeed a good program. I should have
>added "for >32k applications". How hungry is GCC exactly? Where can
>I get it?

You need 2.5 to 3 megs of memory and some HD space. You can ftp it from
titan.ksc.nasa.gov. I use it on my 2.5 meg Amiga 1000.

>--                     __
>Jesper Steen Moller   ///  VOICE: +45 31 62 46 45

--
| ben@epmooch.UUCP   (Ben Mesander)       | "Cash is more important than |
| ben%servalan.UUCP@uokmax.ecn.uoknor.edu |  your mother." - Al Shugart, |
| !chinet!uokmax!servalan!epmooch!ben     |  CEO, Seagate Technologies   |

colin_fox@outbound.wimsey.bc.ca (Colin Fox) (04/07/91)

>Well the main reason is I am creating object files greater than 32K (actually
>some are around 80K).  SAS and Aztec can not handle object files > 32K.  Or
>to
>be more precise a function call to another function in the same object file
>must
>be < 32K apart.
>
>Gee, why don't I just make object files < 32k (in other words write smaller C
>files)?  Well I am not the one producing the C files, my computer is.  I have
>been working on a compiler that uses C as the intermediate langauge and it is
>the output from this compiler that is producing these large C files.  To make
>the compiler produce smaller C files will mean a lot of changes to it.  I
>will
>need to make these changes sometime...
>
Hmmm - first off, if you compile with -b0 (large data), you don't have to
worry about this. Also, the compiler/linker produce ALVs (Automatic Link
Vectors) for calls to locations that are too far for a branch (32K).

>Also, if you have the macro
>
>OBJ=  [ lots of files total length > 255]
>
>and try 
>
>fish: $(OBJ)
>	blink FROM $(OBJ) ...
>
>This will lock up your computer and maybe even trash your hard-drive.
>So you are forced to duplicate the contents of OBJ into a blink file.
>

Yes, it is a known bug with BLink that you only have a 256 character command
line. So what you should do instead is create a temporary link file, which is
done like this:

target: obj1.o obj2.o obj3.o
blink <with <(t:lmk_temp)
FROM c.o obj1.o obj2.o obj3.o
TO  target
smallcode smalldata nodebug
<

And you don't need to create a link file.

caw@miroc.Chi.IL.US (Christopher A. Wichura) (04/08/91)

In article <1991Apr5.173845.4404@nntp-server.caltech.edu> tll@nntp-server.caltech.edu (Tal Lewis Lancaster) writes:
>cpca@marlin.jcu.edu.au (Colin Adams) writes:

>Well the main reason is I am creating object files greater than 32K (actually
>some are around 80K).  SAS and Aztec can not handle object files > 32K.  Or to
>be more precise a function call to another function in the same object file must
>be < 32K apart.

What version of SAS are you using?  I could have sworn that one of the
recent ones fixed this.  I specifically remember that one of the README
files said the compiler would automatically generate a 32bit reference to
functions that were >32k away and in the same object file.

>For example, if you have the following dependency
>
>fish.o:  spam.h
>
>LMK will try to compile spam.c.  So you are forced to use
>
>fish.o: fish.c spam.h

True, that one's annoying.

>Also, if you have the macro
>
>OBJ=  [ lots of files total length > 255]
>
>and try 
>
>fish: $(OBJ)
>	blink FROM $(OBJ) ...
>
>This will lock up your computer and maybe even trash your hard-drive.
>So you are forced to duplicate the contents of OBJ into a blink file.

But you can have the LMK file do this for you easily.  You'd use something
like:

GIFMachine: $(OBJS)
	BLink <WITH < (GIFMachine.lnk)
FROM $(OBJS) version.o
TO GIFMachine
LIB $(LIBS)
$(LFLAGS)
<

LMK will create the BLink WITH file for you, and delete it automatically
once it's done with it (with an option to leave it around if you like).

-=> CAW

Christopher A. Wichura                Multitasking.  Just DO it.
caw@miroc.chi.il.us  (my amiga)                          ...the Amiga way...
u12401@uicvm.uic.edu (school account)

lofaso@triumph.tsd.arlut.utexas.edu (Bernie Lofaso) (04/08/91)

In article <1991Apr5.173845.4404@nntp-server.caltech.edu>, tll@nntp-server.caltech.edu (Tal Lewis Lancaster) writes:

> Well the main reason is I am creating object files greater than 32K (actually
> some are around 80K).  SAS and Aztec can not handle object files > 32K.  Or to
> be more precise a function call to another function in the same object file must
> be < 32K apart.

Sorry, but this is not true. The 32K limit is only for a small code model.
You can specify a large memory model for code with a compiler switch and
the object modules can be any size you want. This pertains to Aztec C but I
would expect SAS to have similar functionality.

rbabel@babylon.rmt.sub.org (Ralph Babel) (04/08/91)

In article <1991Apr8.162329.29538@kuhub.cc.ukans.edu>,
markv@kuhub.cc.ukans.edu writes:

> Sooo, last time I checked (5.1, haven't tried 5.1a, but I
> still think it's a problem), SAS can NOT generate single
> functions or even modules with >32K of code.

Use option "-g". The code generated contains 32-bit
branches. It will work correctly on a 68020/030/040.

1> "do i=1 to 8000;say 'static void f'||i||'(){f1();}';end" >foo.c
1> lc -g -v foo
SAS/C Compiler V5.10a for AmigaDOS
Copyright (C) 1990 SAS Institute, Inc. All Rights Reserved.

Compiling foo.c
Code size greater than 32767 bytes - branches may be incorrect
Please put some subroutines into separate source files.
Module size P=0000CF00 D=00000000 U=00000000

Total files: 1, Compiled OK: 1
1>

> I've hunted the docs for an "implicit" temp file, but
> haven't found it, could you point me at a page number?

Page U80.

Ralph

jsmoller@jsmami.UUCP (Jesper Steen Moller) (04/08/91)

In article <ben.5690@epmooch.UUCP>, Rev. Ben A. Mesander writes:

> >In article <18f1e3ae.ARN128f@jsmami.UUCP> jsmoller@jsmami.UUCP (Jesper Steen Moller) writes:

> I had this problem when I ported GNU Chess with SAS/C 5.10. The 5.10a
> release caused the error messages to go away, and it all seems cool
> now. Are you using SAS/C 5.10a?

Yes. The compiler warns me that there can be a problem, and there is.
Are you using lc1b instead of lc1, then? -r0?

> >Hope you don't rely on SAS/C too much...
> 
> Well, when it spams out on me, I shut down any extra programs I  have
> running and fire up GCC :-) A good 90% of the time, SAS/C does the job.

It does the job 100% of the time for me at the moment (I can't use DICE
for load-libraries...), and it is indeed a good program. I should have
added "for >32k applications". How hungry is GCC exactly? Where can
I get it?

> | ben@epmooch.UUCP   (Ben Mesander)       | "Cash is more important than |
> | ben%servalan.UUCP@uokmax.ecn.uoknor.edu |  your mother." - Al Shugart, |
> | !chinet!uokmax!servalan!epmooch!ben     |  CEO, Seagate Technologies   |

--                     __
Jesper Steen Moller   ///  VOICE: +45 31 62 46 45
Maglemosevej 52  __  ///  USENET: cbmehq!cbmdeo!jsmoller
DK-2920 Charl    \\\///  FIDONET: 2:231/84.45
Denmark           \XX/

markv@kuhub.cc.ukans.edu (04/08/91)

In article <20421@cbmvax.commodore.com>, jesup@cbmvax.commodore.com (Randell Jesup) writes:
> In article <1991Apr5.173845.4404@nntp-server.caltech.edu> tll@nntp-server.caltech.edu (Tal Lewis Lancaster) writes:
>>cpca@marlin.jcu.edu.au (Colin Adams) writes:
>>fish.o:  spam.h
>>
>>LMK will try to compile spam.c.  So you are forced to use
>>
>>fish.o: fish.c spam.h
> 
> 	Very annoying, true.  So use some other make (originally the make
> was some separate lattice thing).

Or put some default rules at the beginning of the makefile like:

.h.o:
	# rules for a .h file
.c.o:
	# rules for a .c file

or use the basename macro like:

foo.o:	foo.h
	foocommand $*.h

However, LMK is not perfect.  Try this:

.onerror:
	type $*.err

FOO:	NonExistantTarget
	command

Then LMK will emit type $*.err and bring on the Guru, although the 
command line accepts it as an error.

LMK will accept macros >255 chars, but they need to be broken like

OBJS=a.o b.o c.o \
	d.o e.o

However, if OBJS exceeds 255 chars you overrun the OS command line
limit.  So I wish LMK would support automatic tempfiles like:

Program:	$(OBJS)

	BLINK WITH &&!$(OBJS) LIBS $(LIBS)! TO Program 

I also wish that lc et al. would take "with" files, so you could
feed them long command lines (which can happen if you have multiple
separate object and source dirs, etc).

As for the floating point bugs (esp. in CodeProbe), most of these have
been fixed in 5.1a.  
> -- 
> Randell Jesup, Keeper of AmigaDos, Commodore Engineering.
> {uunet|rutgers}!cbmvax!jesup, jesup@cbmvax.commodore.com  BIX: rjesup  
> Disclaimer: Nothing I say is anything other than my personal opinion.
> Thus spake the Master Ninjei: "To program a million-line operating system
> is easy, to change a man's temperament is more difficult."
> (From "The Zen of Programming")  ;-)
-- 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Mark Gooderum			Only...		\    Good Cheer !!!
Academic Computing Services	       ///	  \___________________________
University of Kansas		     ///  /|         __    _
Bix:	  mgooderum	      \\\  ///  /__| |\/| | | _   /_\  makes it
Bitnet:   MARKV@UKANVAX		\/\/  /    | |  | | |__| /   \ possible...
Internet: markv@kuhub.cc.ukans.edu
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

markv@kuhub.cc.ukans.edu (04/08/91)

In article <colin_fox.3963@outbound.wimsey.bc.ca>, colin_fox@outbound.wimsey.bc.ca (Colin Fox) writes:
>>Gee, why don't I just make object files < 32k (in other words write smaller C
>>files)?  Well I am not the one producing the C files, my computer
> Hmmm - first off, if you compile with -b0 (large data), you don't have to
> worry about this. Also the compiler/linker set produce ALV (Automatic Link
> Vectors) to locations that are too far for a branch (32K).

No, the 32K limit is a problem.  ALVs are generated by BLINK, not LCx.
The compiler will *not* generate a symbolic reference for a jump or
branch inside a given object file, so BLink has no way to generate an
ALV.  Also, the compiler *always* generates relative code for jumps
inside a module.  So ALVs and -r0 only help with executables larger
than 32K, not functions or modules.  -b0 has nothing to do with this;
it is for data addressing.  Sooo, last time I checked (5.1, haven't
tried 5.1a, but I still think it's a problem), SAS can NOT generate
single functions or even modules with >32K of code.  Also (minor
gripe), oml still can't generate an indexed library >256K total size
(forces me to JOIN when I want debug info).

> Yes, it is a known bug with BLink that you only have a 256 character command
> line. So what you should do instead is create a temporary link file, which is
> done like this:
> 
> target: obj1.o obj2.o obj3.o
> blink <with <(t:lmk_temp)
> FROM c.o obj1.o obj2.o obj3.o
> TO  target
> smallcode smalldata nodebug
> <
> 
> And you don't need to create a link file.

Huh?  I don't recognize this construct.  I've hunted the docs for an
"implicit" temp file, but haven't found it; could you point me at a page
number?

Right now I have something like:

DUMMY:	Prep Foo.lnk Foo

Prep:
	#resident compiler and other misc. prep

Foo.lnk:	Foo.lmk
	echo >Foo.lnk FROM $(OBJS_1)
	echo >>Foo.lnk FROM $(OBJS_2)
	etc..
Foo:
	BLINK with Foo.lnk

At least this way everything is in one place.  Another minor nicety
would be "batching", where LMK would put as many files on one command
line as could fit, like (simple case):

.c.o:
	lc -L  {$<}

Foo:	Foo1.c Foo2.c

Would generate:

	lc -L Foo1.c Foo2.c

This is less useful if you're already hitting the command line limit on
the compiler (of course, the ultimate solution is to remove the limit on
the command line, which should be getting possible as BSTRs fade into
oblivion).
-- 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Mark Gooderum			Only...		\    Good Cheer !!!
Academic Computing Services	       ///	  \___________________________
University of Kansas		     ///  /|         __    _
Bix:	  mgooderum	      \\\  ///  /__| |\/| | | _   /_\  makes it
Bitnet:   MARKV@UKANVAX		\/\/  /    | |  | | |__| /   \ possible...
Internet: markv@kuhub.cc.ukans.edu
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

jsmoller@jsmami.UUCP (Jesper Steen Moller) (04/09/91)

In article <1991Apr8.162329.29538@kuhub.cc.ukans.edu>, markv@kuhub.cc.ukans.edu writes:

> > target: obj1.o obj2.o obj3.o
> > blink <with <(t:lmk_temp)
> > FROM c.o obj1.o obj2.o obj3.o
> > TO  target
> > smallcode smalldata nodebug
> > <
> Huh?  I don't recognize this construct.  I've hunted the docs for an
> "implicit" temp file, but haven't found it; could you point me at a page
> number?

Page U81 in my 5.00 manual...
> 
> DUMMY:	Prep Foo.lnk Foo
> 
> Prep:
> 	#resident compiler and other misc. prep
> 
> Foo.lnk:	Foo.lmk
> 	echo >Foo.lnk FROM $(OBJS_1)
> 	echo >>Foo.lnk FROM $(OBJS_2)
> 	etc..
> Foo:
> 	BLINK with Foo.lnk

That's nice...

> Mark Gooderum			Only...		\    Good Cheer !!!

Jesper...

--                     __
Jesper Steen Moller   ///  VOICE: +45 31 62 46 45
Maglemosevej 52  __  ///  USENET: cbmehq!cbmdeo!jsmoller
DK-2920 Charl    \\\///  FIDONET: 2:231/84.45
Denmark           \XX/

tll@nntp-server.caltech.edu (Tal Lewis Lancaster) (04/09/91)

lofaso@triumph.tsd.arlut.utexas.edu (Bernie Lofaso) writes:

>In article <1991Apr5.173845.4404@nntp-server.caltech.edu>, tll@nntp-server.caltech.edu (Tal Lewis Lancaster) writes:

>> Well the main reason is I am creating object files greater than 32K (actually
>> some are around 80K).  SAS and Aztec can not handle object files > 32K.  Or to
>> be more precise a function call to another function in the same object file must
>> be < 32K apart.

>Sorry, but this is not true. The 32K limit is only for a small code model.
>You can specify a large memory model for code with a compiler switch and
>the object modules can be any size you want. This pertains to Aztec C but I
>would expect SAS to have similar functionality.

Sorry, this is true.  I have confirmed it with both SAS and Aztec!  It
doesn't matter which memory model you pick.  Both have a <32K limitation
on calls between functions in the same object file.

Tal Lancaster

jsmoller@jsmami.UUCP (Jesper Steen Moller) (04/09/91)

In article <ben.5758@epmooch.UUCP>, Rev. Ben A. Mesander writes:

> >In article <18f44cf6.ARN12b4@jsmami.UUCP> jsmoller@jsmami.UUCP (Jesper Steen Moller) writes:
> [SAS/C 5.10 problems with handling >32k funcs]
> 
> >Yes. The compiler warns me that there can be a problem, and there is.
> >Are you using lc1b instead of lc1, then? -r0?
> 
> These are the relevant portions of my lmkfile:
> CC	=lc
> CFLAGS	= -cuw -b0

Ok - does this work for object files that exceed 32k in size for the
"text"-section? If yes, you're the only one to get that to work!

> >It does the job 100% of the time for me at the moment (I can't use DICE
> >for load-libraries...), and it is indeed a good program. I should have
> >added "for >32k applications". How hungry is GCC exactly? Where can
> >I get it?
> 
> You need 2.5 to 3 megs of memory and some HD space. You can ftp it from
> titan.ksc.nasa.gov. I use it on my 2.5 meg Amiga 1000.

Hmm, thanks for the info - I have no ftp access, unfortunately. Is it
floating around in the deep waters of the Fish?

> | ben@epmooch.UUCP   (Ben Mesander)       | "Cash is more important than |

Greets, Jesper

--                     __
Jesper Steen Moller   ///  VOICE: +45 31 62 46 45
Maglemosevej 52  __  ///  USENET: cbmehq!cbmdeo!jsmoller
DK-2920 Charl    \\\///  FIDONET: 2:231/84.45
Denmark           \XX/

gregg@cbnewsc.att.com (gregg.g.wonderly) (04/09/91)

From article <1991Apr8.162329.29538@kuhub.cc.ukans.edu>, by markv@kuhub.cc.ukans.edu:
> In article <colin_fox.3963@outbound.wimsey.bc.ca>, colin_fox@outbound.wimsey.bc.ca (Colin Fox) writes:
>> Yes, it is a known bug with BLink that you only have a 256 character command
>> line. So what you should do instead is create a temporary link file, which is
>> done like this:
>> 
>> target: obj1.o obj2.o obj3.o
>> blink <with <(t:lmk_temp)
>> FROM c.o obj1.o obj2.o obj3.o
>> TO  target
>> smallcode smalldata nodebug
>> <
>> 
>> And you don't need to create a link file.
> 
> Huh?  I don't recognize this construct.  I've hunted the docs for an
> "implicit" temp file, but haven't found it; could you point me at a page
> number?

Beware that intermediate files have <CR><LF> at the end of the lines
(left over from MSDOG).  Also, there is a limit on the number of libraries
that BLINK will accept, but not on .o's.  I took a program which was 15
or so source files of about 1000 lines each, shred(1)'d them into
separate directories, compiled them each separately, and then combined
each group into a library.  The result was that blink could not grok all
of the libraries, so I had to change some back to .o's.  There are numerous
other things wrong with larger environments.  I understand that a version
6 of the C compiler tools will be forthcoming at the end of '91, with
several tools completely rewritten.

Also, it is not intended that you do cd's from within one of the implicit
temp files.  Evidently, LMK has some locks that will cause problems if you
try it (don't know why, that is just what they told me).  I wanted to
have an LMK file that would cd to each directory and make the library
there, but had to revert to just using a make(1) program that I picked
up somewhere ages ago (not trusting lmk).

The problem now with all of the libraries is that I can not compile with
DEBUG because the libraries get too big (>256K).  I can't compile just
the modules that I want to debug with DEBUG either because CPR will barf
with some kind of error, evidently confused with only finding some symbols
in the symbol table (when it does this it leaves a lock on the executable).

-- 
-----
gregg.g.wonderly@att.com   (AT&T bell laboratories)

markv@kuhub.cc.ukans.edu (04/10/91)

>>>Yes, it is a known bug with BLink that you only have a 256 character command
>>>line. So what you should do instead is create a temporary link file, which is
>>> done like this:
>>> 
>>> target: obj1.o obj2.o obj3.o
>>> blink <with <(t:lmk_temp)
>>> FROM c.o obj1.o obj2.o obj3.o
>>> TO  target
>>> smallcode smalldata nodebug
>>> <
>>> 
>>> And you don't need to create a link file.
>> 
>> Huh?  I don't recognize this construct.  I've hunted the docs for an
>> "implicit" temp file, but haven't found it; could you point me at a page
>> number?

I still would like to know where in the manual this is mentioned, and
the details of the syntax.  Also, I notice the < in front of the with.
Is this a typo, or is it needed?  Actually, more generally: is this temp
file feature specific to the WITH option of BLink, or is it more
general purpose?

> The problem now with all of the libraries is that I can not compile with
> DEBUG because the libraries get too big (>256K).  I can't compile just
> the modules that I want to debug with DEBUG either because CPR will barf
> with some kind of error, evidently confused with only finding some symbols
> in the symbol table (when it does this it leaves a lock on the executable).

You can make libraries bigger than 256K, just not with OML.  I do one
of two things to get around this problem.  Either I get the library
under 256K by stripping debugging info from object modules (use
BLINK FOO.O NODEBUG PRELINK).  The NODEBUG option strips the debug info,
and PRELINK leaves the references intact so it is still linkable.
Otherwise, I just do:

OBJS=a.o b.o c.o d.o

Foo.lib: $(OBJS)
	join $(OBJS) as Foo.lib

Blink will recognize this as a valid link lib (in the early Amiga days
this was the only kind of link lib).  It is just that linking is a bit
slower, since the lib won't have an index.

Also, you can help a little (when you want debugging info) by using
the ADDSYM option.  This at least puts the names of all public symbols
in the files.  CPR assumes all untagged symbols are pointers to longs,
but you can of course cast to other things.

As for CPR and an "interrupt" key, just hit CTRL-C in the command
window.  The only catch is that you will often be off in space in the
system libs, which is bad.  But you can load a module (assuming you
have some idea where you are), set a breakpoint, and then safely do a
go again.  I've been getting really spoiled now, though: I've been
using Turbo Debugger on a 386 PC at work, and I sure miss hardware
break/watch points.  Can you say fast...

> -- 
> -----
> gregg.g.wonderly@att.com   (AT&T bell laboratories)
-- 
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Mark Gooderum			Only...		\    Good Cheer !!!
Academic Computing Services	       ///	  \___________________________
University of Kansas		     ///  /|         __    _
Bix:	  mgooderum	      \\\  ///  /__| |\/| | | _   /_\  makes it
Bitnet:   MARKV@UKANVAX		\/\/  /    | |  | | |__| /   \ possible...
Internet: markv@kuhub.cc.ukans.edu
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

cs326ag@ux1.cso.uiuc.edu (Loren J. Rittle) (04/10/91)

In article <24@triumph.tsd.arlut.utexas.edu> lofaso@triumph.tsd.arlut.utexas.edu (Bernie Lofaso) writes:
>In article <1991Apr5.173845.4404@nntp-server.caltech.edu>, tll@nntp-server.caltech.edu (Tal Lewis Lancaster) writes:
>
>> Well the main reason is I am creating object files greater than 32K (actually
>> some are around 80K).  SAS and Aztec can not handle object files > 32K.  Or to
>> be more precise a function call to another function in the same object file must
>> be < 32K apart.
>
>Sorry, but this is not true. The 32K limit is only for a small code model.
>You can specify a large memory model for code with a compiler switch and
>the object modules can be any size you want. This pertains to Aztec C but I
>would expect SAS to have similar functionality.

I think some of the confusion can be shown with the following example:
<start of test.c>
#define TEN(a) a;a;a;a;a;a;a;a;a;a;

void test(int a, int b, int c, int d, int e, int f);

int a, b, c, d, e, f;

void main(void)
{
a:
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  TEN(TEN(test(a, b, c, d, e, f)))
  goto a;
}
<end of test.c>

When compiled with `lc -L -b0 -r0 -d0 test' yields:
SAS/C Compiler V5.10a for AmigaDOS                            
Copyright (C) 1990 SAS Institute, Inc. All Rights Reserved.   
                                                              
Compiling test.c                                              
Code size greater than 32767 bytes - branches may be incorrect
Please put some subroutines into separate source files.

BUT, the code that was generated looks ok.  The goto a; generated a BRA.L
instruction as expected, etc.  Why does SAS/C give me that bogus
warning?  Who knows!  I only have *one* function in the source file!

ALSO, note,
TEN(TEN(TEN(test(a, b, c, d, e, f)))) caused the machine to crash... :-(

Loren J. Rittle
-- 
``The Amiga continues to amaze me--if I had not been told that this video was
  created using the Amiga and Toaster, I would not have believed it.  Even     
  Allen said, `I think I know how he did most of the effects.' '' - Jim Lange
  Loren J. Rittle  l-rittle@uiuc.edu

ben@epmooch.UUCP (Rev. Ben A. Mesander) (04/10/91)

>In article <18f5a7bb.ARN12cc@jsmami.UUCP> jsmoller@jsmami.UUCP (Jesper Steen Moller) writes:
>In article <ben.5758@epmooch.UUCP>, Rev. Ben A. Mesander writes:
>
>> >In article <18f44cf6.ARN12b4@jsmami.UUCP> jsmoller@jsmami.UUCP (Jesper Steen Moller) writes:
>> [SAS/C 5.10 problems with handling >32k funcs]
>> 
>> >Yes. The compiler warns me that there can be a problem, and there is.
>> >Are you using lc1b instead of lc1, then? -r0?
>> 
>> These are the relevant portions of my lmkfile:
>> CC	=lc
>> CFLAGS	= -cuw -b0
>
>Ok - does this work for object files that exceed 32k in size for the
>"text"-section? If yes, you're the only one to get that to work!

Well, I can't really say. GNU C is the only program that I've compiled
that had a function >32K. I had this problem with 5.05, and not with
5.10a. It could be that the 5.10a compiler optimizes well enough that it
pulled the code size below 32K, but for whatever reason, I don't get
that error message anymore.

I wasn't clear at the time if the problem with 5.05 was the size of the
code in an individual *file* or an individual *function*. Perhaps they
fixed the 32K limit on the file and not on individual functions?

Pure speculation.

[where to get GCC]

>Hmm, thanks for the info - I have no ftp access, unfortunately. Is it
>floating around in the deep waters of the Fish?

No, but there are mailservers that can send it to you through the mail.
c.s.a.datacomm might be the place to ask about this...

One more thing - the SAS/C optimizer breaks on GNU Chess...

>Jesper Steen Moller   ///  VOICE: +45 31 62 46 45

--
| ben@epmooch.UUCP   (Ben Mesander)       | "Cash is more important than |
| ben%servalan.UUCP@uokmax.ecn.uoknor.edu |  your mother." - Al Shugart, |
| !chinet!uokmax!servalan!epmooch!ben     |  CEO, Seagate Technologies   |

ben@epmooch.UUCP (Rev. Ben A. Mesander) (04/11/91)

>In article <1991Apr11.151049.617@eve.wright.edu> amercer@desire.wright.edu (Art Mercer) writes:
>From article <ben.5810@epmooch.UUCP>, by ben@epmooch.UUCP (Rev. Ben A. Mesander):
>[..Stuff omitted..]
>> One more thing - the SAS/C optimizer breaks on GNU Chess...
>
>I haven't seen the code, but any GNU software that uses their "regexp" stuff
>will break the optimizer.  I just break-out the regex in the make file so
>that the optimizer is not called.

GNU Chess doesn't use the regexp package. The GCC versions have some
funny CFLAGS concerning optimization:

CFLAGS=	-O -finline-functions -fstrength-reduce

I'm at home, so I don't have my GCC manual, but I'm pretty sure that the
-finline-functions tells it to inline "simple" functions. I can't remember
exactly what -fstrength-reduce does, but it is another sort of optimization.
The Chess code can be described as "hairy", I think.

I originally ported it with Lattice 5.05, and I had to butcher the code
so badly (mostly function prototypes with "short" in them) that my copy
of the code will no longer compile with GCC. It works just fine
with SAS/C 5.10a. I tried the unmodified source on my Sun 3 at work with
GCC, and it compiled just fine.

>Art Mercer
>	amercer@eve.wright.edu

--
| ben@epmooch.UUCP   (Ben Mesander)       | "Cash is more important than |
| ben%servalan.UUCP@uokmax.ecn.uoknor.edu |  your mother." - Al Shugart, |
| !chinet!uokmax!servalan!epmooch!ben     |  CEO, Seagate Technologies   |

chucks@pnet51.orb.mn.org (Erik Funkenbusch) (04/12/91)

lofaso@triumph.tsd.arlut.utexas.edu (Bernie Lofaso) writes:
>In article <1991Apr5.173845.4404@nntp-server.caltech.edu>, tll@nntp-server.caltech.edu (Tal Lewis Lancaster) writes:
>
>> Well the main reason is I am creating object files greater than 32K (actually
>> some are around 80K).  SAS and Aztec can not handle object files > 32K.  Or to
>> be more precise a function call to another function in the same object file must
>> be < 32K apart.
>
>Sorry, but this is not true. The 32K limit is only for a small code model.
>You can specify a large memory model for code with a compiler switch and
>the object modules can be any size you want. This pertains to Aztec C but I
>would expect SAS to have similar functionality.


And actually, even in small code and data models it can still be >32k.  If it
is, though, it will incur a performance loss due to the jump table lookups.
UUCP: {amdahl!tcnet, crash}!orbit!pnet51!chucks
ARPA: crash!orbit!pnet51!chucks@nosc.mil
INET: chucks@pnet51.orb.mn.org

amercer@desire.wright.edu (Art Mercer) (04/12/91)

From article <ben.5810@epmooch.UUCP>, by ben@epmooch.UUCP (Rev. Ben A. Mesander):
[..Stuff omitted..]
> One more thing - the SAS/C optimizer breaks on GNU Chess...

I haven't seen the code, but any GNU software that uses their "regexp" stuff
will break the optimizer.  I just break-out the regex in the make file so
that the optimizer is not called.
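A sketch of what that lmkfile workaround might look like: give the regex module its own explicit rule whose command line leaves the optimizer out, so every other module still gets the optimizing CFLAGS.  The flags here are illustrative, modelled on the CFLAGS quoted earlier in the thread; check the SAS/C manual for the actual optimizer switch.

```make
# hypothetical lmkfile fragment -- flag names are illustrative
CC     = lc
CFLAGS = -cuw -O

# explicit rule for regex.o only: same flags minus the optimizer
regex.o: regex.c
	lc -cuw regex.c
```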

> 
>>Jesper Steen Moller   ///  VOICE: +45 31 62 46 45
> 
> --
> | ben@epmooch.UUCP   (Ben Mesander)       | "Cash is more important than |
> | ben%servalan.UUCP@uokmax.ecn.uoknor.edu |  your mother." - Al Shugart, |
> | !chinet!uokmax!servalan!epmooch!ben     |  CEO, Seagate Technologies   |


Art Mercer
Associate Director, Academic Computing Resources
040T Library Annex
Wright State University
Col. Glenn HWY
Dayton, OH 45435

EMail:
	amercer@eve.wright.edu
 <or>
	amercer@wright.bitnet
PHONE:	(513)873-4038

---