[llvm] r215343 - In LVI(Lazy Value Info), originally value on a BB can only be calculated once,

Jiangning Liu liujiangning1 at gmail.com
Wed Sep 24 18:32:30 PDT 2014


Hi Dan,

2014-09-25 6:01 GMT+08:00 Daniel Berlin <dberlin at dberlin.org>:

>
>
> On Wed, Sep 24, 2014 at 12:12 AM, Jiangning Liu <liujiangning1 at gmail.com>
> wrote:
>
>> Hi Dan,
>>
>> I considered your question again, and now I think the lattice lowering
>> order in this algorithm should be "overdefined -> constant/constant_range".
>>
>> At first glance, following the textbook, it seems the lattice value TOP
>> should be "undefined" and BOTTOM should be "overdefined". But this is not
>> true for the specific implementation in this LLVM algorithm.
>>
>
> Why?
>

See my comments below.


>
>
>>
>> In this algorithm, the lowering order is
>>
>> Overdefined(TOP) -> constant/constant_range -> Undefined(BOTTOM).
>>
>>
> What do overdefined and undefined mean then?
>

I think "overdefined" means a value that can't be known as a constant at
compile time, so it might be any value. "undefined" means a value we don't
care about at all until we evaluate it, so before the algorithm evaluates
it, its value is unknown.


>
>
>> To make the algorithm easy to implement, BBLV is originally initialized
>> to BOTTOM, but this doesn't mean the lowering start point is "Undefined";
>> it is still "Overdefined".
>>
>
> If it's initialized to bottom, then that is the TOP of the lattice ;)
>
>
>> If we never touch it in the algorithm, it will be kept as it is.
>>
>
> Right.
>
>
>> This is why, once we visit a value at a specific BB, the algorithm
>> changes Undefined to Overdefined immediately at the very beginning of
>> each lowering procedure.
>>
>
> It should either be lowered to constant or overdefined.
> if lowered to constant, it may be lowered again later to overdefined.
> It should not be lowered to overdefined and then raised back to constant.
>
>
> If it is lowering to overdefined too quickly, you should make it *not
> lower* to overdefined. When I read the patch and saw the implementation,
> it looked like you take *overdefined* values and raise them to *constant*
> sometimes.
> Did I misread it?
>

I don't think it raises to constant; it lowers to constant. I'd say that
changing "undefined" to "overdefined" in the current algorithm is not a
lowering, but only an initialization to "overdefined". If a value is never
visited by the algorithm, it will stay "undefined" forever, but this is a
meaningless value, and the algorithm never returns it. The algorithm can
only return either overdefined or constant/constant_range.


>
>
>
>
>> If you look into the implementation details of this algorithm, you may
>> find that the lowering order was originally like that. Originally the
>> algorithm returns "overdefined"(TOP) forever for a specific (Val, BB) pair.
>>
>
> This sounds like either a buggy or conservative implementation then, and I
> would fix *this* issue not by raising overdefined occasionally, but by
> stopping it from getting to overdefined in the first place.
>

Again, I think it is not raising overdefined but lowering overdefined, so
I don't think it is a bug. I admit the implementation may be somewhat
misleading and is not perfect, but conceptually I personally think it is
acceptable, although it doesn't strictly follow the textbook.


>
>
>> This could miss some optimization opportunities, as I described in the
>> comment. My patch tries to increase the number of times this
>> "overdefined"(TOP) value is lowered.
>>
>
>
>
>>
>> This is my current understanding, but maybe I'm wrong.
>>
>> Thanks,
>> -Jiangning
>>
>>
>> 2014-09-23 11:08 GMT+08:00 Jiangning Liu <liujiangning1 at gmail.com>:
>>
>>> Hi Dan,
>>>
>>>>
>>>> So can you explain how you aren't doing this?
>>>>
>>>> It looks like you think the lattice goes
>>>> undefined
>>>> overdefined
>>>> constant
>>>>
>>>> That should 100% not be the case
>>>> the lattice order should be
>>>> undefined
>>>> constant
>>>> overdefined
>>>>
>>>> undefined means you don't know.
>>>> constant means it has one value.
>>>> overdefined means it has too many values.
>>>>
>>>> This is a traditional value range lattice (though sometimes there are
>>>> more things in the middle).
>>>> There is no way you should go from overdefined back to constant.
>>>>
>>> Ah, I see. Thanks for your explanation! I think you are absolutely
>>> correct, and I misunderstood the lattice value 'overdefined'. The
>>> original code is like this,
>>>
>>>   if (!BBLV.isUndefined()) {
>>>      ...
>>>      return;
>>>   }
>>>
>>>   // Otherwise, this is the first time we're seeing this block.  Reset the
>>>   // lattice value to overdefined, so that cycles will terminate and be
>>>   // conservatively correct.
>>>   BBLV.markOverdefined();
>>>
>>> So this algorithm is really conservative. I think the solution might be
>>> removing this lowering or adding threshold control for this lowering. Do
>>> you have any suggestions?
>>>
>>> Thanks,
>>> -Jiangning
>>>
>>
>>
>