[lldb-dev] Was this an unintended consequence of the Args switch to StringRef's?
Zachary Turner via lldb-dev
lldb-dev at lists.llvm.org
Wed Apr 5 16:18:18 PDT 2017
What if you write this:
  // Explicitly try hex. This will catch the case where there is no 0x prefix.
  if (entry.ref.getAsInteger(16, uval64)) {
    // If that fails, let the algorithm auto-detect the radix. This will
    // catch the case where there is a 0x prefix.
    if (entry.ref.getAsInteger(0, uval64)) {
Would that work? Alternatively, we could probably update the implementation
of getAsInteger in LLVM to allow you to manually hint the radix AND prefix
the number, as long as the prefix matches the hint.
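Something like the following sketch is what I have in mind; ParseHexValue is
just an illustrative wrapper name, not actual CommandObjectMemory code:

  #include "llvm/ADT/StringRef.h"
  #include <cstdint>

  // Try an explicit hex radix first, then fall back to radix auto-detection.
  // getAsInteger returns true on failure, so invert it for "did it parse".
  static bool ParseHexValue(llvm::StringRef arg, uint64_t &uval64) {
    // Radix 16 accepts bare hex digits such as "62".
    if (!arg.getAsInteger(16, uval64))
      return true;
    // Radix 0 auto-detects the radix, so the "0x" prefix in "0x62" is
    // accepted as well.
    return !arg.getAsInteger(0, uval64);
  }

With that, both "62" and "0x62" parse to the same value, which matches the
old behavior of memory write -f x.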
On Wed, Apr 5, 2017 at 4:10 PM Jim Ingham <jingham at apple.com> wrote:
> >
>
> I think somebody was being too clever. The real purpose for specifying
> the format is to set the byte size of the item written down:
>
>   OptionValueUInt64 &byte_size_value = m_format_options.GetByteSizeValue();
>   size_t item_byte_size = byte_size_value.GetCurrentValue();
>
> and to do some special magic for c-strings and the like. Somebody (not
> me) must have thought overloading "hex" format to mean "you can drop the
> 0x" was a tasty little side benefit.
>
> But it's been that way forever, and I don't know who's relying on that
> behavior, so I'd rather not change it (intentionally) if we don't have to.
>
> Jim
>
> > On Wed, Apr 5, 2017 at 3:37 PM Jim Ingham via lldb-dev <
> lldb-dev at lists.llvm.org> wrote:
> > memory write's argument ingestion was changed as part of the
> StringRefifying of Args so that we get:
> >
> > (lldb) memory write &buffer 0x62
> > error: '0x62' is not a valid hex string value.
> >
> > That seems unexpected and not desirable. What's going on is that the
> default format is hex, and if the format is hex, the command also supports:
> >
> > (lldb) memory write -f x &buffer 62
> > (lldb) fr v/x buffer[0]
> > (char) buffer[0] = 0x62
> >
> > The StringRef version of the args parsing is:
> >
> >   case eFormatDefault:
> >   case eFormatBytes:
> >   case eFormatHex:
> >   case eFormatHexUppercase:
> >   case eFormatPointer:
> >     // Decode hex bytes
> >     if (entry.ref.getAsInteger(16, uval64)) {
> >
> > The problem is that passing "0x62" to getAsInteger with an explicit radix
> > of 16 gets rejected.
> >
> > We do want to hint the radix. But it seems weird to reject an explicit
> indicator. Is there some clever way to use the StringRef functions to get
> the desired effect, or do I have to hack around this by manually stripping
> the 0x if I see it?
> >
> > Jim
> >
> >
> > _______________________________________________
> > lldb-dev mailing list
> > lldb-dev at lists.llvm.org
> > http://lists.llvm.org/cgi-bin/mailman/listinfo/lldb-dev
>
>
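For reference, the "manually stripping the 0x" workaround Jim asks about in
the quoted message could be as small as this sketch; StripAndParseHex is only
an illustrative name, not code from the tree:

  #include "llvm/ADT/StringRef.h"
  #include <cstdint>

  // Drop an optional 0x/0X prefix, then parse with an explicit hex radix so
  // that both "62" and "0x62" are treated as hexadecimal.
  static bool StripAndParseHex(llvm::StringRef arg, uint64_t &uval64) {
    if (arg.startswith("0x") || arg.startswith("0X"))
      arg = arg.drop_front(2);
    // getAsInteger returns true on failure.
    return !arg.getAsInteger(16, uval64);
  }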