[comp.sys.ibm.pc.programmer] SUMMARY: How to generate JMP FAR 0:0 w/TASM 1.5 + TLINK 1.01

mbeck@wheaties.ai.mit.edu (Mark Becker) (03/01/90)

Hello *

[ This is a repost.. drive holding news filled up and wouldn't permit
  posts to escape. ]

Some time ago I asked for suggestions on assembling a JMP FAR 0:0
using Borland's TASM 1.5 and TLINK 1.01.  Here is a summary of the
responses.

Several suggested I use DB statements and explicitly declare the
opcodes in hex to defeat the assembler.  This works.. but, as was also
pointed out, it isn't really readable or easy to maintain.  It does do
the job though.
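
For the record, the hand-assembled version looks something like this
(0EAh is the opcode for a direct far JMP, followed by the target
offset and then the target segment):

	db	0EAh			; opcode: direct JMP FAR
	dw	0			; target offset (new IP)
	dw	0			; target segment (new CS)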

qmsseq!pipkins offered some comments on why self-modifying code is
bad, and a really relevant comment on how this could mess up a program
given that the Intel 80x8{8|6} processors have instruction prefetch
queues.  Changing the effective address in an instruction that may
already have been prefetched isn't exactly a good technique.
Definitely a good way to wind up in Deep Space.
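
(For what it's worth, the usual workaround when code really must be
patched is to put a jump between the store and the patched
instruction; any jump flushes the prefetch queue on these parts.  A
sketch, with made-up labels:)

		mov	byte ptr cs:[patched+1], 12h	; rewrite the operand byte
		jmp	short flush		; any jump flushes the prefetch queue
	flush:
		. . .
	patched:
		mov	al, 0			; immediate byte gets patched above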

Also, if using an 80286 or 80386 in protected mode, my code would
generate an exception and abort.  (TSR's probably wouldn't be found in
such an environment.. but it's a good idea to be aware of the
limitation!)

Now I know why Borland's high-level languages generate code like this
(and why several respondents suggested it):

	function label	dword			; 4-byte far pointer (offset, then segment)
		db	4 dup (?)
		. . .
	leap:	jmp	dword ptr cs:[function]	; indirect far jump through it

That is the code generated by TCC 2.01 in response to this C code:

	void far (*function)(void);
	main() {
		(*function)();
	}
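
For my TSR that boils down to keeping a four-byte vector in the code
segment, filling it in once at installation time, and always jumping
through it indirectly.  A sketch, with names of my own invention:

	oldvec	dd	?			; offset:segment, filled at install time
		. . .
	; at installation, the target address is in ES:BX
		mov	word ptr cs:[oldvec], bx	; low word  = offset (new IP)
		mov	word ptr cs:[oldvec+2], es	; high word = segment (new CS)
		. . .
	; during normal execution
		jmp	dword ptr cs:[oldvec]		; indirect far jump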

qmsseq!pipkins suggested an alternative:

	Push	ES
	Push	BX
	RetF

I'll have to think about that.. it requires the proper data in ES:BX
and that's only established at initialization time, not during normal
execution.
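
It would fly, though, if the far address were squirrelled away
somewhere at initialization time and pushed back just before the
RetF.  A sketch (the two storage words are names of my own making):

	save_seg	dw	?		; filled from ES at initialization
	save_off	dw	?		; filled from BX at initialization
		. . .
		push	cs:[save_seg]		; RETF pops this into CS..
		push	cs:[save_off]		; ..after popping this into IP
		retf				; net effect: far jump to save_seg:save_off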

howardm@Neon.Stanford.EDU had a fix for the label I used that seemed
to coax TASM into generating the kind of code I wanted.

	d	SEGMENT at 0h
	lab	dw	?
	d	ENDS
	. . .
	; JMP far lab
	;jmp	lab
	DB	0EAh			; direct far JMP opcode..
	DW	OFFSET lab		; ..plus target offset..
	DW	SEG d			; ..plus target segment

This got me thinking about what I had and I tried:

	j	segment at 0		; segment:offset of j is now..
		org	0		; .. known at compile time.
	farjump	label far
	j	ends
	;
	; Down in the code area..
	jmp	farjump

TLINK gave no complaints when resolving the .OBJ into a .COM file
using this construct.
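
For anyone who wants to reproduce it, the code side of the .COM file
is nothing special; with the j segment above, a skeleton something
like this is all it takes (a sketch, not the exact source I
assembled):

	code	segment
		assume	cs:code, ds:code
		org	100h
	start:	jmp	farjump		; assembles as a direct JMP FAR 0:0
	code	ends
		end	start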

One wrote:

	You are writing for Intel segmented architecture, therefore
	you are living in a silly world.

Ouch.  :-) Well, yeah.. I like 'flat' architectures too.  But
sometimes taking on a challenge is good for the soul.  And definitely
generates dark spots under one's eyes. :-) (No flames.. they'll be
cheerfully dumped to /dev/null :-) ).

This same respondent added:

	Probably incorrect segment definition of 'farptr'.

And my adding "AT 0" to the segment definition around farptr seems to
have fixed that.

----------------------------------------

With regards to the request for a 16-bit in-segment jump, I'm still
having problems.  Several people suggested code of the form:

		jmp	Next
	Next:	. . .

Yes, the jump references a label that hasn't yet been seen.  But TASM
still generates a short jump and adds in a NOP byte... I guess TASM
reserves room for a 16-bit offset while the label is unresolved, but
then emits a SHORT jump (1-byte displacement) and fills the unused
byte with the NOP.

By a 16-bit in-segment jump, I meant a jump that would load IP
directly from memory and not depend on an offset calculation within
the processor.
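
In code, something along these lines (the variable name is only for
illustration):

	next_ip	dw	offset Next	; 16-bit target offset, can be patched
		. . .
		jmp	word ptr cs:[next_ip]	; IP loaded straight from memory
	Next:
		. . .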

Actually I found this construct in my BIOS and didn't understand why
until one late respondent pointed out that some early versions of IC's
in PC's had problems with back-to-back I/O-space references.

I would like to thank those that responded:

Brad Jones	<bjones@uhccux.uhcc.hawaii.edu>
Eric Ng		<c162-dr@zooey.berkeley.edu>
Noel		<santa@bourbaki.mit.edu>
Otto J. Makela	<otto@jyu.fi>
Jeff Pipkins	<qmsseq!pipkins@decwrl.dec.com>
Leppäjärvi Jouni	<so-jml@stekt.oulu.fi>
Frank Whaley	<few@quad1.quad.com>
Howard A. Miller <howardm@Neon.Stanford.EDU>
Ralf Brown	<Ralf.Brown@b.gp.cs.cmu.edu>
-- 
+-----------------------------------------------+-----------------------+
| Mark Becker					| .signature under	|
| Internet: mbeck@ai.mit.edu			|	construction	|
+-----------------------------------------------+-----------------------+