[comp.sys.apple2] TeX

acmfiu@serss0.fiu.edu (ACMFIU) (01/30/91)

ok texhackers, here's what's going on with tex on the GS:

	o it is being worked on. i am doing this at the present time.
	o the conversion is being done via TeX->pascal->c. this is a very
	  tedious process. however, this much has been done.
	o because of bugs in orca/c you will not see tex on the gs until
	  mikey decides to fix bugs in orca/c (which he will start soon).
	o because of the above, i'm working on the conversion on a Sun at
	  school so i can maybe get it working. however, we're tight on space
	  so i might have to put the work on hold soon (the disk is at 98%
	  capacity right now).
	o if anyone has orca/pascal, call mike NOW and demand that he let
	  arrays/records be larger than 64K for the next release of pascal.
	  if this is done, TeX will be done in a week.
	o on the note of dvi drivers, Doug McIntyre ported Nelson Beebe's
	  imagewriter II driver to the IIgs. this might be what others are
	  referring to.
	o when TeX is complete, be warned that you will need about 2+ megs
	  to run the thing. however, i will probably put extensive work into
	  modifying TeX to allocate memory dynamically. TeX is a total kluge,
	  a very poorly written program. i am quite shocked that knuth wrote
	  it this way, yet am happy he did, as what the program does is
	  beautiful.
	o the only other modification i want to make to TeX is to get rid of
	  the tex.pool file. the pool file makes future minor upgrades of TeX
	  particularly dangerous (i.e. i'd have to go through the entire
	  conversion process all over again). i have a shell script now that
	  does all the work, and i hope human intervention is never needed.
	  (a quick sketch of the pool-file idea follows this list.)
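
the pool-file sketch, in C: instead of parsing tex.pool at startup, bake
the string pool into the binary as static data. the two strings below are
placeholders, not real pool contents; the actual table would be generated
from tex.web by tangle and would hold a few thousand entries.

    #include <stdio.h>

    /* the string pool compiled in as static data, instead of being
       read from tex.pool at run time.  placeholder entries only. */
    static const char *pool[] = {
        "buffer size",
        "pool size",
    };
    #define POOL_COUNT (sizeof pool / sizeof pool[0])

    int main(void)
    {
        unsigned i;
        for (i = 0; i < POOL_COUNT; i++)   /* prove the data is there */
            printf("string %u: %s\n", i, pool[i]);
        return 0;
    }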

well, that's about it. i will be sure to notify you of things as they happen.
incidentally, i have TeX split up into 45+ files at home on the GS. i have
been able to see parts of TeX running during preliminary tests but, as stated
before, bugs in orca/c just don't make it worth my while to beat my head against
the computer to continue this on the GS.

albert

ericmcg@pnet91.cts.com (Eric Mcgillicuddy) (02/04/91)

64K arrays are not a problem if you are willing to go to the trouble of breaking
up your data into segments. 64K is a HUGE amount of memory; what could you
possibly store that requires more than this to be present at any one time? 2Megs
would be nice, but design it to work on 1Meg if you want anybody to use it.
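
A minimal sketch of that segmenting idea in C (every name and size below
is illustrative; none of this comes from ViM or Orca/C): one logical 256K
array split into eight banks that each stay under 64K, with the bank
arithmetic hidden behind a macro.

    #include <stdio.h>
    #include <stdlib.h>

    #define BANK_BITS 13
    #define BANK_SIZE (1L << BANK_BITS)    /* 8192 longs = 32K per bank */
    #define NBANKS    8                    /* 65536 longs = 256K total  */

    static long *bank[NBANKS];

    /* index the eight banks as if they were one flat array */
    #define BIGARRAY(i) (bank[(i) >> BANK_BITS][(i) & (BANK_SIZE - 1)])

    int main(void)
    {
        long i;
        for (i = 0; i < NBANKS; i++)
            if ((bank[i] = (long *)malloc(BANK_SIZE * sizeof(long))) == NULL)
                return 1;
        for (i = 0; i < NBANKS * BANK_SIZE; i++)
            BIGARRAY(i) = i;
        printf("%ld\n", BIGARRAY(65535L)); /* prints 65535 */
        return 0;
    }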

BTW, some major obstacles have been crossed with ViM and I hope to have it
ready soon. Although I call it Virtual Memory, it is more of an automatic
overlay system. This helps you out a bit, but you MUST break up your data (and
code to some extent) to take advantage of it. I had posted a preliminary list
of restrictions a few months back; I can repost it if anyone is interested.

BTW, ViM allows up to 16Meg of backing storage.

UUCP: bkj386!pnet91!ericmcg
INET: ericmcg@pnet91.cts.com

gwyn@smoke.brl.mil (Doug Gwyn) (02/05/91)

In article <439@generic.UUCP> ericmcg@pnet91.cts.com (Eric Mcgillicuddy) writes:
>64K arrays are not a problem if you are willing to go to the trouble of breaking
>up your data into segments.

Or in C, they shouldn't be a problem if you malloc() the space rather
than using static allocation.
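
A minimal illustration (whether Orca/C's malloc() will actually hand
back a block this large is a separate question, and note the long
constant: a 16-bit int could not even express the size):

    #include <stdio.h>
    #include <stdlib.h>

    #define N 100000L    /* well past the 64K static limit */

    int main(void)
    {
        char *buf = (char *)malloc(N);   /* needs a size_t wider than 16 bits */
        if (buf == NULL) {
            fprintf(stderr, "no room for %ld bytes\n", N);
            return 1;
        }
        buf[N - 1] = 'x';                /* index with a long, not an int */
        printf("%c\n", buf[N - 1]);
        free(buf);
        return 0;
    }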

acmfiu@scs.fiu.edu (ACMFIU) (02/05/91)

In article <439@generic.UUCP> ericmcg@pnet91.cts.com (Eric Mcgillicuddy) writes:
>64K arrays are not a problem if you are willing to go to the trouble of breaking
>up your data into segments. 64K is a HUGE amount of memory; what could you
>possibly store that requires more than this to be present at any one time? 2Megs
>would be nice, but design it to work on 1Meg if you want anybody to use it.
>
>BTW, some major obstacles have been crossed with ViM and I hope to have it
>ready soon. Although I call it Virtual Memory, it is more of an automatic
>overlay system. This helps you out a bit, but you MUST break up your data (and
>code to some extent) to take advantage of it. I had posted a preliminary list
>of restrictions a few months back; I can repost it if anyone is interested.

Have you ever taken a look at the source for TeX? There are several
"types" of TeX. By this I mean that various people distribute TeX built
with small or big memory constraints. The one I am using at present uses
1meg+ of data. Yes, just data, no code. As a matter of fact, I don't
even think the program can fit on an 800K disk (i can't remember, as
it's been a while since i worked on it on the GS, though i continue to
work with it on the Sun). Anyway, TeX keeps everything in arrays. Because
TeX was meant to be extremely portable, Don Knuth decided not to make any
assumptions about the system TeX would be ported to. That is one reason
why TeX produces the same output on every machine it runs on (otherwise
it may not formally be called "TeX").
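
to make the array point concrete: TeX builds its linked structures out of
array indices instead of machine pointers, so the same code compiles
everywhere. a toy illustration (the field and constant names are mine,
not tex.web's):

    #include <stdio.h>

    #define NULL_IDX 0            /* index 0 plays the role of NULL */

    struct node { long info; long link; };  /* link is an index, not a pointer */
    static struct node mem[100];            /* all nodes live in one array */
    static long head = NULL_IDX;

    static void push(long idx, long value)
    {
        mem[idx].info = value;
        mem[idx].link = head;     /* splice in by index */
        head = idx;
    }

    int main(void)
    {
        long p;
        push(1, 10);
        push(2, 20);
        for (p = head; p != NULL_IDX; p = mem[p].link)
            printf("%ld\n", mem[p].info);   /* prints 20, then 10 */
        return 0;
    }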

Once the conversion is complete, I plan on making some major modifications
to the way the TeX code looks. Primarily, I'd like to get rid of the TeX.pool
file and make all memory dynamically allocated. I haven't looked at this
extensively yet because I first want to get the port done.
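
in effect, the dynamic-allocation plan is Doug Gwyn's malloc() suggestion
applied to TeX's big arrays. a hedged sketch of what startup might look
like (mem and memory_word follow tex.web's naming; the sizing logic below
is a guess at the approach, not code from the port):

    #include <stdio.h>
    #include <stdlib.h>

    typedef long memory_word;    /* stand-in for TeX's real union type */

    static memory_word *mem;     /* was: static memory_word mem[MEM_MAX + 1]; */
    static long mem_max;

    int main(int argc, char **argv)
    {
        mem_max = (argc > 1) ? atol(argv[1]) : 65534L;
        mem = (memory_word *)malloc((mem_max + 1) * sizeof(memory_word));
        if (mem == NULL) {
            fprintf(stderr, "tex: no room for %ld words of main memory\n",
                    mem_max + 1);
            return 1;
        }
        /* ... initialize and run TeX proper ... */
        free(mem);
        return 0;
    }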

As to the speed consideration some people have been talking about, I
don't know yet how fast it will run. I don't have an accelerator in my
GS, so if i can use it there, you can probably rest assured it runs
adequately fast. I've had certain TeX documents on the IBM version I
work on that bring TeX screeching to a halt for seconds or minutes. I'll
be sure to compare times on the IBM and the GS once things are done.

Oh yeah, one thing i forgot to mention above. I am not using the default
memory constraints defined in the original tex.web program. Those are far
too small for those of you wishing to use LaTeX, BibTeX, AMSTeX, etc.
I forget the exact differences now, but I'll be sure to have a version of
TeX with just the original memory constraints, and others with big and
bigger constraints. Of course this will come soon after TeX is complete,
because removing the tex.pool file and making all memory dynamically
allocated is not an overnight project.
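
a sketch of what the small/big distinction looks like (mem_max and
pool_size are genuine tex.web identifiers; the values and the #ifdef
scheme are made up for illustration):

    #include <stdio.h>

    /* "big TeX" is the same program compiled with larger constants */
    #ifdef BIG_TEX
    #define MEM_MAX   262140L    /* words of main memory */
    #define POOL_SIZE 100000L    /* characters of string pool */
    #else
    #define MEM_MAX    30000L
    #define POOL_SIZE  32000L
    #endif

    typedef long memory_word;             /* stand-in for the real union */
    static memory_word mem[MEM_MAX + 1];  /* boxes, lists, tokens live here */
    static char str_pool[POOL_SIZE];

    int main(void)
    {
        printf("mem[] is %ld bytes, str_pool[] is %ld bytes\n",
               (long)sizeof mem, (long)sizeof str_pool);
        return 0;
    }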

I can't vouch for the speed of the previewer. All that stuff will come later,
if not by me then by other interested parties who want to extend TeX.

albert

jlevitsky@gnh-applesauce.cts.com (Joshua Levitsky) (02/05/91)

I have never heard of TeX. What all does it do? I use Orca/C and Orca/Pascal...
are you using the latest versions... C (1.1), Pascal (2.1)??? I think there
were some big bugs squashed in 2.1.
-Joshie EMT

INET: jlevitsky@gnh-applesauce.cts.com
UUCP: crash!pnet01!gnh-applesauce!jlevitsky
ARPA: crash!pnet01!gnh-applesauce!jlevitsky@nosc.mil

acmfiu@serss0.fiu.edu (ACMFIU) (02/05/91)

In article <m0j31wy-0000FiC@jartel.info.com> jlevitsky@gnh-applesauce.cts.com (Joshua Levitsky) writes:
>I have never heard of TeX. What all does it do? I use Orca/C and Orca/Pascal...
>are you using the latest versions... C (1.1), Pascal (2.1)??? I think there
>were some big bugs squashed in 2.1.
>-Joshie EMT

when did orca/pascal 2.1 come out? last i heard 1.2 was available and mike
would work on some bug fixes soon, but not on updates to the pascal
compiler. go to your favorite book store and check out "The TeXbook" for
info about TeX. knuth wrote it, so he can give you more info than i :)

as to some "big bugs squashed": well, there were quite a few bugs squashed
between orca/c v1.0 and v1.1. some people here have hailed mike as some kind
of "hero" for giving out v1.1 free to all v1.0 owners. well, either they
didn't see the number of bugs mike fixed between the two versions, or they
get money from byteworks for that stuff (no harm intended).

oh yeah, the next time you talk to mike, demand that the next release of
orca/pascal let you have arrays > 64K. it's quite lame to break up your
program into 64K chunks. ever look at the code in compress that handles
this on those brain-dead XENIX systems?
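
for reference, the compress workaround looks roughly like this (array
names follow the compress sources as best i remember; exact sizes in the
real code may differ): the ~69000-entry hash table becomes several static
arrays that each stay under 64K bytes, glued together by an indexing macro.

    #include <stdio.h>

    typedef long count_int;

    static count_int htab0[8192], htab1[8192], htab2[8192], htab3[8192],
                     htab4[8192], htab5[8192], htab6[8192], htab7[8192],
                     htab8[69001 - 65536];    /* the odd-sized tail */

    static count_int *htab[9] = { htab0, htab1, htab2, htab3, htab4,
                                  htab5, htab6, htab7, htab8 };

    #define htabof(i) (htab[(i) >> 13][(i) & 0x1fff])

    int main(void)
    {
        htabof(68999L) = 42;             /* indexes past 64K transparently */
        printf("%ld\n", (long)htabof(68999L));
        return 0;
    }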

albert