[llvm-dev] Spurious peeling in simple loop unrolling

Thomas Preud'homme via llvm-dev llvm-dev at lists.llvm.org
Tue Mar 9 05:13:30 PST 2021


On 08/03/2021 16:50, Florian Hahn wrote:


On Mar 4, 2021, at 15:53, Thomas Preud'homme <thomasp at graphcore.ai<mailto:thomasp at graphcore.ai>> wrote:

Hi,

On 04/03/2021 13:32, Florian Hahn wrote:

1. Only peel if the PHI that becomes invariant has any concrete users; if the only users of a phi that becomes invariant are things like llvm.assume, peeling should be very unlikely to be beneficial (note that currently peeling also seems to happily peel for completely unused phis)
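To illustrate the first suggestion, here is a hypothetical reduced example (function name and values invented for illustration) where a pointer phi becomes loop-invariant after the first iteration, but its only user is llvm.assume, so peeling to make it invariant would buy nothing:

```llvm
define void @f(i8* %a, i8* %b, i64 %n) {
entry:
  br label %loop

loop:
  %iv = phi i64 [ 0, %entry ], [ %iv.next, %loop ]
  ; %p is %b on every iteration after the first, i.e. invariant once peeled,
  ; but its only user is the assume below.
  %p = phi i8* [ %a, %entry ], [ %b, %loop ]
  %c = icmp ne i8* %p, null
  call void @llvm.assume(i1 %c)
  %iv.next = add i64 %iv, 1
  %done = icmp eq i64 %iv.next, %n
  br i1 %done, label %exit, label %loop

exit:
  ret void
}

declare void @llvm.assume(i1)
```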


In my example the %34 phi is used in a getelementptr that is itself used for a load; no assume is involved. That phi alone would still cause the issue.



Oh right, there are multiple redundant phis! So that indeed does not apply in this case. It might still be beneficial in general.


Certainly



2. Instcombine before peeling already simplifies the IR so that both incoming values are bitcasts of the same value. It would probably be trivial to also have instcombine simplify pointer phis whose incoming values, stripped of pointer casts, are equal. (There might be some other reason why we are not doing this at the moment though.)
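A minimal sketch of the kind of phi the second suggestion targets (function and value names are invented): both incoming values strip to the same underlying pointer, so instcombine could fold the phi to a single bitcast.

```llvm
define i32 @g(i8* %base, i64 %n) {
entry:
  %bc1 = bitcast i8* %base to i32*
  %bc2 = bitcast i8* %base to i32*
  br label %loop

loop:
  %iv = phi i64 [ 0, %entry ], [ %iv.next, %loop ]
  ; Both incoming values strip to %base, so %p could be replaced by %bc1.
  %p = phi i32* [ %bc1, %entry ], [ %bc2, %loop ]
  %v = load i32, i32* %p, align 4
  %iv.next = add i64 %iv, 1
  %done = icmp eq i64 %iv.next, %n
  br i1 %done, label %exit, label %loop

exit:
  ret i32 %v
}
```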


As mentioned above, it's not as simple as bitcasts of the same pointer, so this would not work here. One would have to check whether the loads are equivalent, which is a more involved check.


That’s true, but I think instcombine has already CSE’d the loads by that point. So if we simplify such phis, the unnecessary peeling should not happen: https://reviews.llvm.org/D98058


I tried the patch (thanks) but it did not remove any of the PHIs: the two loads are still there, so the bitcasts don't appear to have the same source. I'll try to look at InstCombine to see why the loads are not CSE'd.
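For concreteness, a hypothetical shape of the IR that would block the cast-stripping fold (names invented): the two incoming values come from two separate but equivalent loads, so until the loads are CSE'd into one instruction the phi's incoming values have different sources and the fold cannot fire.

```llvm
define i32 @h(i32** %slot, i64 %n) {
entry:
  %l1 = load i32*, i32** %slot, align 8
  %l2 = load i32*, i32** %slot, align 8   ; same address, redundant load
  br label %loop

loop:
  %iv = phi i64 [ 0, %entry ], [ %iv.next, %loop ]
  ; %l1 and %l2 are distinct instructions, so stripping casts does not show
  ; them as equal; CSE'ing the loads first would make this phi trivial.
  %p = phi i32* [ %l1, %entry ], [ %l2, %loop ]
  %v = load i32, i32* %p, align 4
  %iv.next = add i64 %iv, 1
  %done = icmp eq i64 %iv.next, %n
  br i1 %done, label %exit, label %loop

exit:
  ret i32 %v
}
```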


Thanks for the help. Best regards,


Thomas
