[gnu.gcc.bug] bug in handling bit patterns

rgoguen@bbn.com (Robert J Goguen) (12/01/88)

I'm building a port to the 3B2. The memory layout of my machine is as follows.



Increasing addresses ------------------------>>>>>>>>

31                                            7 6 5 4 3 2 1 0
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
              |               |              |              |
  HIGH WORD   |               |              |  LOW WORD    |
              |               |              |              |
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
msb                                                       lsb

I have BITS_BIG_ENDIAN, BYTES_BIG_ENDIAN and WORDS_BIG_ENDIAN defined.

The most significant bit in a byte is the lowest numbered, the most significant
byte in a word is the lowest numbered, and the most significant word in a
multiword value is the lowest numbered, so the above defines are right.

	GCC produces wrong RTL when given the following source code:

struct foo {
     int twobit:2;
     int       :1;
     int threebit:3;
     int onebit:1;
   };

main()
{
   struct foo s3 ;

   s3.onebit = 1;
   printf("s3.onebit = %d\n",s3.onebit);
}

Compiling the above source with gcc -dg -S tst.c yields the following
tst.c.greg output:

;; Function main

;; Register dispositions: 16 in 0 

;; Hard regs used:  0 10 12

(note 1 0 2 "" -1)

(note 2 1 3 "" -2)

(insn 3 2 4 (set (mem/s:QI (plus:SI (reg:SI 10)
               (const_int 3)))
       (ior:QI (mem/s:QI (plus:SI (reg:SI 10)
                   (const_int 3)))
           (const_int 2))) 61 (nil)	<<<<!!!! This should be 64
   (nil))

(insn 4 3 5 (set (mem:SI (pre_inc:SI (reg:SI 12)))
       (symbol_ref:SI ("*.LC0"))) 18 (nil)
   (nil))

(insn 5 4 6 (set (reg:SI 0)
	(zero_extract:SI (mem/s:QI (plus:SI (reg:SI 10)
                   (const_int 3)))
           (const_int 1)
           (const_int 6))) 84 (nil)
   (nil))

(insn 6 5 7 (set (mem:SI (pre_inc:SI (reg:SI 12)))
       (reg:SI 0)) 18 (nil)
   (nil))

(call_insn 7 6 8 (set (reg:SI 0)
       (call (mem:QI (symbol_ref:SI ("printf")))
           (const_int 8))) 113 (nil)
   (nil))

(note 8 7 9 "" -3)

(note 9 8 10 "" -6)

(code_label 10 9 0 1)

The assembly language produced from this RTL is as follows:

.text
.LC0:
        .byte   115,51,46,111,110,101,98,105,116,32,61,32,37,100
        .byte   10,00
        .align 4
.globl main
main:
        save &0
        ADDW2 &4,%sp
        ORB2 &2,3(%fp)		<<<!!!! This Should be ORB2 &64,3(%fp)
        PUSHW &.LC0
        EXTFB &1,&6,3(%fp),%r0  <<<!!! Notice it extracts from the 6th bit
        PUSHW %r0
        call &2,printf
.L1:
        SUBW2 &4,%sp
        ret &0


	The GCC front end puts the one in the wrong bit position, but the
extraction of the bit is from the right position. When I compile gcc with
the AT&T cc, I get a working gcc compiler. When I use this compiler to
compile the gcc sources, I get a gcc compiler compiled with gcc. When I use
that compiler to compile a simple program like the following:

main()
{
}

	The compiler crashes, and I get all sorts of weird messages from
parts of the compiler it should never reach. I think this has to do with
bit-field problems in the gcc compiler compiled with AT&T cc. I pass
almost all of the tests from the c-torture-test except the bit pattern
tests.

	Also, why isn't the insv pattern matched here? (It is matched
correctly for other bit-field operations.)


Bob Goguen