[LLVMdev] load bytecode from string for jiting problem

Willy WOLFF willy.wolff at etu.unistra.fr
Wed Mar 19 10:32:38 PDT 2014


I made the change, and I still have the problem.

I investigated the LLVM source code further.

First, I changed the isRawBitcode function to print the contents of its 
parameters, like this:
original: 
http://llvm.org/docs/doxygen/html/ReaderWriter_8h_source.html#l00081
   inline bool isRawBitcode(const unsigned char *BufPtr,
                            const unsigned char *BufEnd) {
     // These bytes sort of have a hidden message, but it's not in
     // little-endian this time, and it's a little redundant.
     errs() << "isRawBitcode output:\n";
     for (int i = 0; i < 4; i++)
       errs() << BufPtr[i] << "\n";
     if (BufPtr != BufEnd)
       errs() << "BP != BE ok\n";
     if (BufPtr[0] == 'B')
       errs() << "B ok\n";
     if (BufPtr[1] == 'C')
       errs() << "C ok\n";
     if (BufPtr[2] == 0xc0)
       errs() << "0xc0 ok\n";
     if (BufPtr[3] == 0xde)
       errs() << "0xde ok\n";

     return BufPtr != BufEnd &&
            BufPtr[0] == 'B' &&
            BufPtr[1] == 'C' &&
            BufPtr[2] == 0xc0 &&
            BufPtr[3] == 0xde;
   }
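
(As an aside: 0xc0 and 0xde are not printable, so streaming them raw as above 
shows up as blank lines in the output below. A hex dump is easier to read; 
here is a small sketch of the kind of helper one could drop in -- it is not 
part of the build I actually ran:)

   // Sketch only: dump the first N bytes of a buffer in hex, so that the
   // unprintable 0xc0 / 0xde signature bytes are visible.
   #include "llvm/Support/Format.h"
   #include "llvm/Support/raw_ostream.h"

   static void dumpLeadingBytes(const unsigned char *BufPtr, unsigned N) {
     for (unsigned i = 0; i != N; ++i)
       llvm::errs() << llvm::format("%02x ", (unsigned)BufPtr[i]);
     llvm::errs() << "\n";
   }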


Second, I changed ParseBitcodeInto like this:
original: 
http://llvm.org/docs/doxygen/html/BitcodeReader_8cpp_source.html#l01971
...
   errs() << "parsebitcodeinto sniff the signature\n";
   uint32_t bvar = Stream.Read(8);
   errs() << "B :" << bvar << "\n";
   if (bvar != 'B') {
     errs() << "B :" << bvar << "\n";
     return Error(InvalidBitcodeSignature);
   }
   if (Stream.Read(8) != 'C') {
     errs() << "C\n";
     return Error(InvalidBitcodeSignature);
   }
   if (Stream.Read(8) != 0xc0) {
     errs() << "0xc0\n";
     return Error(InvalidBitcodeSignature);
   }
   if (Stream.Read(8) != 0xde) {
     errs() << "0xde\n";
     return Error(InvalidBitcodeSignature);
   }
   // Original check: the 0xC0DE tail is read as four 4-bit fields (low
   // nibble first), which the two 8-bit reads above reproduce.
   // if (Stream.Read(8) != 'B' ||
   //     Stream.Read(8) != 'C' ||
   //     Stream.Read(4) != 0x0 ||
   //     Stream.Read(4) != 0xC ||
   //     Stream.Read(4) != 0xE ||
   //     Stream.Read(4) != 0xD
   //    ) {
...



The output of the code is:


isRawBitcode output:
B
C


BP != BE ok
B ok
C ok
0xc0 ok
0xde ok

parsebitcodeinto sniff the signature
B :37
B :37




Is it possible that the Stream object is not correctly initialized?
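
For completeness, here is a minimal sketch of the read/parse path I am 
describing, using the same calls as in the original mail quoted below 
(getMemBufferCopy copies the bytes, so the buffer does not depend on the 
lifetime of a temporary std::string; names like loadEmbeddedBitcode are just 
for illustration):

   #include "llvm/ADT/StringRef.h"
   #include "llvm/Bitcode/ReaderWriter.h"
   #include "llvm/IR/LLVMContext.h"
   #include "llvm/IR/Module.h"
   #include "llvm/Support/ErrorOr.h"
   #include "llvm/Support/MemoryBuffer.h"
   #include "llvm/Support/raw_ostream.h"
   #include "llvm/Support/system_error.h"
   #include <cstddef>

   using namespace llvm;

   Module *loadEmbeddedBitcode(const char *gv, size_t gv_length) {
     StringRef Bitcode(gv, gv_length);
     // Copy the bitcode bytes into a buffer that owns its memory, so nothing
     // the parser touches refers to freed data.
     MemoryBuffer *Buf =
         MemoryBuffer::getMemBufferCopy(Bitcode, "embedded-bitcode");
     ErrorOr<Module *> ModOrErr = parseBitcodeFile(Buf, getGlobalContext());
     if (error_code EC = ModOrErr.getError()) {
       errs() << "parseBitcodeFile: " << EC.message() << "\n";
       return 0;
     }
     return ModOrErr.get();
   }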

On 03/13/2014 06:37 PM, Will Dietz wrote:
> On Thu, Mar 13, 2014 at 9:02 AM, Willy WOLFF <willy.wolff at etu.unistra.fr> wrote:
>> Hello,
>>
>> I'm having a weird problem while writing a bytecode module to a string
>> and then reading/parsing it for use with a JIT.
>>
>> I wrote a pass that exports a function into a module, and I put this module
>> inside a global variable.
>> I use WriteBitcodeToFile for this.
>> For debugging, after this write, I try to load the exported module with
>> parseBitcodeFile.
>> These two steps work.
>>
>>
>>
>> Afterwards, while the compiled program is running, I try to read and parse
>> this global variable in order to JIT the function.
>>
>> 1) I read the global variable with
>>    StringRef sr (gv, gv_length);
>>
>> 2) I manually test this bytecode (inspired by inline bool isRawBitcode(const
>> unsigned char *BufPtr, const unsigned char *BufEnd) at
>> http://llvm.org/docs/doxygen/html/ReaderWriter_8h_source.html#l00067):
>>    if (sr.str()[0] == 'B')
>>      std::cout << "B ok\n";
>>    if (sr.str()[1] == 'C')
>>      std::cout << "C ok\n";
>>    if (sr.str()[2] == (char) 0xc0)
>>      std::cout << "0xc0 ok\n";
>>    if (sr.str()[3] == (char) 0xde)
>>      std::cout << "0xde ok\n";
>>
>> 3) I try to parse the gv with:
>>    MemoryBuffer* mbjit = MemoryBuffer::getMemBuffer (sr.str());
>
> Not sure if this is your issue, but should be fixed anyway:
>
> The std::string created by "sr.str()" ends its lifetime in this
> statement, and MemoryBuffer, for efficiency reasons,
> avoids copying data it doesn't have to (like StringRef), so it will be
> referencing the freed memory.
>
> To resolve this:
> * Pass MemoryBuffer your StringRef directly
> * Use getMemBufferCopy()
> * Preserve the result of sr.str() in a stack variable and pass that
> to getMemBuffer() instead.
>
> As a final note, check whether your bitcode buffer "string" is
> null-terminated or not.  If not, be careful to do things like
> informing MemoryBuffer that this is the case.
>
> Hope this helps,
> ~Will
>
>>    LLVMContext& context = getGlobalContext();
>>    ErrorOr<Module*> ModuleOrErr = parseBitcodeFile (mbjit, context);
>>    if (error_code EC = ModuleOrErr.getError())
>>    {
>>      std::cout << ModuleOrErr.getError().message() << "\n";
>>      assert(false);
>>    }
>>
>>
>>
>>
>> This is the execution result:
>> B ok
>> C ok
>> 0xc0 ok
>> 0xde ok
>> Invalid bitcode signature
>>
>>
>>
>> OK, it's not working :/
>> But why???
>>
>>
>>
>> For debugging, between 2) and 3), I export the module I just read, write it
>> to a file on my hard drive, and try llvm-dis; the disassembly of the module
>> works.
>>
>> What's wrong? Any idea how to solve this problem?
>>
>> Thank you very much.
>>
>> Regards,
>> Willy
>> _______________________________________________
>> LLVM Developers mailing list
>> LLVMdev at cs.uiuc.edu         http://llvm.cs.uiuc.edu
>> http://lists.cs.uiuc.edu/mailman/listinfo/llvmdev


