[LLVMdev] recommended workaround for distinguishing signed vs unsigned integers

Bruce Hoult bruce at hoult.org
Wed Feb 18 22:27:47 PST 2015


That might be useful if your *CPU* distinguished them (which no current
CPUs do -- there are signed and unsigned *operations*, not signed and
unsigned values).

It is irrelevant to your programming language. You do your type checking in
your compiler, and emit LLVM code with i32 values the same way you'd emit
machine code with 32-bit values that are neither signed nor unsigned.
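To illustrate the point: in LLVM IR the signedness lives in the operations,
not the values. The same sign-free i32 operands can be divided with sdiv or
udiv, compared with icmp slt or icmp ult, and widened with sext or zext,
depending on what your frontend's type checker decided. A minimal sketch
(function names are illustrative, not from the original thread):

```llvm
; Both functions take identical sign-free i32 operands; the frontend
; picks the instruction based on the source-level types.
define i32 @div_signed(i32 %a, i32 %b) {
entry:
  %q = sdiv i32 %a, %b          ; signed division
  ret i32 %q
}

define i32 @div_unsigned(i32 %a, i32 %b) {
entry:
  %q = udiv i32 %a, %b          ; unsigned division
  ret i32 %q
}

define i32 @widen_signed(i16 %x) {
entry:
  %w = sext i16 %x to i32       ; sign-extend: value was signed
  ret i32 %w
}

define i32 @widen_unsigned(i16 %x) {
entry:
  %w = zext i16 %x to i32       ; zero-extend: value was unsigned
  ret i32 %w
}
```

Operations whose result is the same for both interpretations (add, sub, mul,
and the bitwise ops) have only one form, which is exactly why a single i32
type suffices.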

On Thu, Feb 19, 2015 at 7:04 PM, Timothee Cour <
timothee.cour2+llvm at gmail.com> wrote:

> Since llvm doesn't distinguish signed vs unsigned integers anymore, what
> is the recommended way to represent a language that distinguishes them? Is
> that to introduce new types, eg:
> %SignedI32 = type { i32 }
> %UnsignedI32 = type { i32 }
> ?
>
> _______________________________________________
> LLVM Developers mailing list
> LLVMdev at cs.uiuc.edu         http://llvm.cs.uiuc.edu
> http://lists.cs.uiuc.edu/mailman/listinfo/llvmdev
>
>

