[LLVMdev] should InstCombine preserve @llvm.assume?

Jingyue Wu jingyue at google.com
Wed Jun 10 10:00:56 PDT 2015


Thanks! Filed as https://llvm.org/bugs/show_bug.cgi?id=23809.

On Wed, Jun 10, 2015 at 5:32 AM, Hal Finkel <hfinkel at anl.gov> wrote:

> ----- Original Message -----
> > From: "Jingyue Wu" <jingyue at google.com>
> > To: "LLVM Developers Mailing List" <llvmdev at cs.uiuc.edu>
> > Cc: "David Majnemer" <david.majnemer at gmail.com>, "Hal Finkel" <
> hfinkel at anl.gov>
> > Sent: Wednesday, June 10, 2015 1:56:54 AM
> > Subject: should InstCombine preserve @llvm.assume?
> >
> >
> >
> > Hi,
> >
> >
> > I have some WIP that leverages @llvm.assume in some optimization
> > passes other than InstCombine. However, it doesn't work yet because
> > InstCombine removes @llvm.assume calls that are useful for later
> > optimizations. For example, given
> >
> >
> >
> > define i32 @foo(i32 %a, i32 %b) {
> >   %sum = add i32 %a, %b
> >   %1 = icmp sge i32 %sum, 0
> >   call void @llvm.assume(i1 %1)
> >   ret i32 %sum
> > }
> >
> >
> > "opt -instcombine" emits
> >
> >
> >
> > define i32 @foo(i32 %a, i32 %b) {
> >   %sum = add i32 %a, %b
> >   ret i32 %sum
> > }
> >
> >
> > removing the @llvm.assume call, so later optimizations can no longer
> > tell that %sum is non-negative. (Note that the opt I use here has
> > http://reviews.llvm.org/D10283 applied; that patch fixes a bug in
> > ValueTracking.)
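> >
> > For reference, here is a self-contained version of the input; the only
> > addition is the @llvm.assume declaration, so it can be fed to opt
> > directly:
> >
> > ; repro.ll -- run with: opt -instcombine -S repro.ll
> > declare void @llvm.assume(i1)
> >
> > define i32 @foo(i32 %a, i32 %b) {
> >   %sum = add i32 %a, %b
> >   %1 = icmp sge i32 %sum, 0
> >   call void @llvm.assume(i1 %1)
> >   ret i32 %sum
> > }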
> >
> >
> > The reasons InstCombine removes this assume call are:
> > 1) SimplifyICmpInst proves %1 is true based on the assume call.
> > 2) Then, InstCombine
> > (http://llvm.org/docs/doxygen/html/InstCombineCompares_8cpp_source.html#l02649)
> > replaces all uses of %1 with true, including the use in the assume
> > call.
> > 3) Somewhere later, llvm.assume(true) is considered trivially dead
> > and thus removed from the IR.
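> >
> > Concretely, after step 2 the IR looks roughly like this (a sketch of
> > the intermediate state), and step 3 then deletes the assume:
> >
> >   %sum = add i32 %a, %b
> >   %1 = icmp sge i32 %sum, 0        ; no longer used, erased as dead code
> >   call void @llvm.assume(i1 true)  ; trivially true, removed in step 3
> >   ret i32 %sum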
>
> This is a bug; my guess is that the context instruction is not being set
> correctly in this case. The @llvm.assume should never be used to simplify
> instructions that are used only by the @llvm.assume. If the context
> instruction is set correctly, the logic in isValidAssumeForContext should
> prevent this.
>
> Feel free to file a bug with this IR and I'll look at it.
>
> >
> >
> > Step 2 looks particularly problematic to me. Removing @llvm.assume
> > essentially throws away the basis of the proof, so we won't be able
> > to make the same proof any more.
> >
> >
> >
> > How can we fix this issue? One heavy-handed approach is, instead of
> > RAUW'ing the icmp, to replace only the uses that are not in an
> > @llvm.assume. But I have two concerns.
> >
> >
> > 1) What if the icmp is not directly used by an @llvm.assume? E.g., if
> > we proved a >= 0 and that condition is used in assume(a >= 0 || b >=
> > 0), should we keep (a >= 0) in case later passes use it? If yes, we
> > would probably have to traverse def-use chains recursively. If no,
> > some assumption is again lost after instcombine. (An IR sketch of
> > this case follows these two points.)
> >
> >
> > 2) What if the kept assumes are never leveraged later at all? These
> > assumes add extra uses of values and could potentially hurt other
> > optimizations. This looks like a problem with adding @llvm.assume in
> > general. Maybe users should be aware of these trade-offs when adding
> > __builtin_assume calls in their source code.
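> >
> > To make concern 1 concrete, a hypothetical snippet (not from my
> > original test case) could look like this:
> >
> >   ; %c1 reaches the assume only through %or, so a rule that only
> >   ; spares *direct* assume operands would still RAUW %c1 with true;
> >   ; %or then folds to true and the assume again becomes trivially dead.
> >   %c1 = icmp sge i32 %a, 0
> >   %c2 = icmp sge i32 %b, 0
> >   %or = or i1 %c1, %c2
> >   call void @llvm.assume(i1 %or)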
>
> Yes, this is the established tradeoff. The LangRef contains a warning
> about this explicitly:
>
> "Note that the optimizer might limit the transformations performed on
> values used by the llvm.assume intrinsic in order to preserve the
> instructions only used to form the intrinsic’s input argument. This might
> prove undesirable if the extra information provided by the llvm.assume
> intrinsic does not cause sufficient overall improvement in code quality.
> For this reason, llvm.assume should not be used to document basic
> mathematical invariants that the optimizer can otherwise deduce or facts
> that are of little use to the optimizer."
>
>  -Hal
>
> >
> >
> > Thoughts?
> >
> >
> > Thanks,
> > Jingyue
>
> --
> Hal Finkel
> Assistant Computational Scientist
> Leadership Computing Facility
> Argonne National Laboratory
>