[LLVMdev] bitcast i32 ... to i32 "magically fixes" value?

John Clements aoeullvm at brinckerhoff.org
Mon Jan 19 18:18:56 PST 2009


Here's a piece of code that behaves in a way that appears insane to me; I'm hoping that someone can either explain to me why this is the right behavior, or label it a bug.

target datalayout = "e-p:32:32:32-i1:8:8-i8:8:8-i16:16:16-i32:32:32-i64:32:64-f32:32:32-f64:32:64-v64:64:64-v128:128:128-a0:0:64-f80:128:128"
target triple = "i386-apple-darwin8"

define i32 @entry() {
	%result = invoke i32 @main_0() to label %Done unwind label %Exn
Done:
	%magically_fixed = bitcast i32 %result to i32
	ret i32 %result		; change to %magically_fixed
;;;	ret i32 %magically_fixed

Exn:
	ret i32 16
}

define i32 @main_0() {
	ret i32 8
}

@.str = internal constant [7 x i8] c"0x%08x\00"		; <[7 x i8]*> [#uses=1]

define i32 @main(i32 %argc, i8** %argv) nounwind {
entry:
	%tmp = call i32 @entry( ) nounwind		; <i32> [#uses=1]
	%tmp2 = call i32 (i8*, ...)* @printf( i8* getelementptr ([7 x i8]* @.str, i32 0, i32 0), i32 %tmp ) nounwind		; <i32> [#uses=0]
	ret i32 0
}

declare i32 @printf(i8*, ...) nounwind


I compile and run it like this:

llvm-as -f all.s
llvm-ld all.s.bc
./a.out


And the behavior I observe is that if I return "%magically_fixed" rather than "%result", I get the '0x00000008' I expect. As is, though, the program prints out '0x00000001'.

Now, I'm guessing that I'm somehow misunderstanding the type system, and the result of the 'invoke' is not an i32... but then, why would the bitcast succeed? I'm baffled.

Many thanks for any help; if I can't figure this out by Wednesday, my compilers class will be confused by this as well :).


John Clements

