[LLVMdev] LLVM supports Unicode?

geovanisouza92 at gmail.com geovanisouza92 at gmail.com
Sun Aug 28 18:12:45 PDT 2011


Thanks, Bagel and Erik!

Your replies helped me so much!


2011/8/28 Erik de Castro Lopo <mle+cl at mega-nerd.com>

> geovanisouza92 at gmail.com wrote:
>
> > I'm trying create a new programming language, and I want that it have
> > Unicode support (support for read and manipulate rightly the source-code
> and
> > string literals).
>
> LLVM IR itself only supports one string type, which is an array of
> i8 (8-bit integers). In your compiler you can use UTF-8, and any
> UTF-8 string literal can be stored in an i8 array in the LLVM IR.
>
> For example, the LLVM backend for the DDC compiler [0] does this:
>
>   @str = internal constant [4 x i8] c"bar\00", align 8
>
>
> HTH,
> Erik
>
> [0] http://disciple.ouroborus.net/
> --
> ----------------------------------------------------------------------
> Erik de Castro Lopo
> http://www.mega-nerd.com/
> _______________________________________________
> LLVM Developers mailing list
> LLVMdev at cs.uiuc.edu         http://llvm.cs.uiuc.edu
> http://lists.cs.uiuc.edu/mailman/listinfo/llvmdev
>
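The approach Erik describes can be sketched in a front end as follows. This is a minimal illustration (not from the thread, and the helper name is made up): encode the literal as UTF-8, append a NUL terminator, and emit it as an LLVM i8 array constant like the DDC example above. Escaping every byte as `\XX` hex sidesteps questions about which characters LLVM IR allows verbatim inside `c"..."`.

```python
def llvm_string_constant(name, text):
    """Render a string literal as a NUL-terminated LLVM i8 array constant.

    The text is encoded as UTF-8, so non-ASCII characters simply become
    multiple i8 elements; LLVM itself never needs to know about Unicode.
    """
    data = text.encode("utf-8") + b"\x00"
    # Escape every byte as \XX hex, which is always valid in an IR c"..." literal.
    body = "".join("\\%02X" % b for b in data)
    return '@%s = internal constant [%d x i8] c"%s"' % (name, len(data), body)

print(llvm_string_constant("str", "bar"))
# → @str = internal constant [4 x i8] c"\62\61\72\00"
```

Note that the array length counts UTF-8 bytes, not characters: a literal like "bär" occupies 5 i8 elements (4 bytes of UTF-8 plus the terminator).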



-- 
@geovanisouza92 - Geovani de Souza