[lldb-dev] Was this an unintended consequence of the Args switch to StringRef's?

Jim Ingham via lldb-dev lldb-dev at lists.llvm.org
Wed Apr 5 15:36:58 PDT 2017

memory write's argument ingestion was changed as part of the StringRefifying of Args so that we get:

(lldb) memory write &buffer 0x62
error: '0x62' is not a valid hex string value.

That seems unexpected and not desirable.  What's going on is that the default format is hex, and if the format is hex, the command also supports:

(lldb) memory write -f x &buffer 62 
(lldb) fr v/x buffer[0]
(char) buffer[0] = 0x62

The StringRef version of the args parsing is:

      case eFormatDefault:
      case eFormatBytes:
      case eFormatHex:
      case eFormatHexUppercase:
      case eFormatPointer:
        // Decode hex bytes
        if (entry.ref.getAsInteger(16, uval64)) {

The problem is that passing "0x62" to getAsInteger with an explicit radix of 16 fails: once the radix is specified, the "0x" prefix isn't auto-sensed, and the 'x' is treated as an invalid hex digit.

We do want to hint the radix.  But it seems weird to reject an explicit indicator.  Is there some clever way to use the StringRef functions to get the desired effect, or do I have to hack around this by manually stripping the 0x if I see it?
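For what it's worth, a minimal sketch (plain C++, no LLVM, names like ParseHexValue are illustrative, not actual lldb code) of the manual-stripping workaround. StrictHexToInteger below stands in for getAsInteger(16, ...), which likewise returns true on failure and treats the 'x' in "0x62" as a bad digit; overflow checking is omitted:

```cpp
#include <cstdint>
#include <string_view>

// Stand-in for StringRef::getAsInteger(16, ...): strict hex-digit
// parse, returning true on error (matching getAsInteger's convention).
bool StrictHexToInteger(std::string_view s, uint64_t &out) {
  if (s.empty())
    return true;
  uint64_t val = 0;
  for (char c : s) {
    unsigned digit;
    if (c >= '0' && c <= '9')
      digit = c - '0';
    else if (c >= 'a' && c <= 'f')
      digit = c - 'a' + 10;
    else if (c >= 'A' && c <= 'F')
      digit = c - 'A' + 10;
    else
      return true; // the 'x' in "0x62" lands here
    val = val * 16 + digit; // overflow checking omitted for brevity
  }
  out = val;
  return false;
}

// The workaround: drop an explicit "0x"/"0X" prefix before the strict
// radix-16 parse, so both "0x62" and "62" are accepted as hex.
// Returns true on success (unlike getAsInteger) for readability here.
bool ParseHexValue(std::string_view ref, uint64_t &uval64) {
  if (ref.size() > 2 &&
      (ref.substr(0, 2) == "0x" || ref.substr(0, 2) == "0X"))
    ref.remove_prefix(2);
  return !StrictHexToInteger(ref, uval64);
}
```

Note that getAsInteger with radix 0 does auto-sense a "0x" prefix, but it would then read a bare "62" as decimal, which is the wrong default for this command, so stripping the prefix and keeping radix 16 seems to be the simplest fit.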
