[LLVMdev] Expected behavior of calling bitcasted functions?
Arsenault, Matthew
Matthew.Arsenault at amd.com
Wed May 29 17:40:06 PDT 2013
Hi,
I'm not sure what the expected behavior of calling a bitcasted function is. Suppose you have a case like this (which you get at the source level from the alias attribute, __attribute__((alias))):
@alias_f32 = alias bitcast (i32 (i32)* @func_i32 to float (float)*)

define internal i32 @func_i32(i32 %v) noinline nounwind {
entry:
  ret i32 %v
}

define void @bitcast_alias_scalar(float* noalias %source, float* noalias %dest) nounwind {
entry:
  %arrayidx = getelementptr float* %source, i32 0
  %tmp2 = load float* %arrayidx, align 8
  %call = call float @alias_f32(float %tmp2) nounwind
  %arrayidx8 = getelementptr float* %dest, i32 0
  store float %call, float* %arrayidx8, align 8
  ret void
}
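As far as I can tell, the call through the alias behaves the same as calling directly through the bitcast constant expression, i.e. roughly:

  %call = call float bitcast (i32 (i32)* @func_i32 to float (float)*)(float %tmp2) nounwind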
If you run opt -instcombine on this, it is transformed into:
define void @bitcast_alias_scalar(float* noalias %source, float* noalias %dest) nounwind {
entry:
  %tmp2 = load float* %source, align 8
  %0 = fptoui float %tmp2 to i32
  %call = call i32 @func_i32(i32 %0) nounwind
  %1 = uitofp i32 %call to float
  store float %1, float* %dest, align 8
  ret void
}
Note the fptoui / uitofp conversions to the underlying function's argument and return types. I would expect this to bitcast the arguments and call the underlying function instead. The transformation happens in InstCombiner::transformConstExprCastCall. Judging from some of the comments and tests, this behavior almost seems intentional, so I'm not sure what is supposed to be going on here. A conversion that changes the bits doesn't make any sense to me.
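What I would have expected is something along these lines (just a sketch of the output I'd expect, reinterpreting the bits rather than converting the values):

define void @bitcast_alias_scalar(float* noalias %source, float* noalias %dest) nounwind {
entry:
  %tmp2 = load float* %source, align 8
  %0 = bitcast float %tmp2 to i32
  %call = call i32 @func_i32(i32 %0) nounwind
  %1 = bitcast i32 %call to float
  store float %1, float* %dest, align 8
  ret void
}

Is the current behavior intended, or should this be producing bitcasts?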