[llvm-dev] Does poison add implicit "definedness" under the hood ?
Juneyoung Lee via llvm-dev
llvm-dev at lists.llvm.org
Wed Oct 28 19:29:37 PDT 2020
Hi Stefanos,
> So, to justify this transformation as correct, implicitly, poison has
> _added definedness_ to signed wrapping: specifically, that the
> computer won't explode if SW happens. AFAIU, that is ok as far as C++
> semantics
> are concerned:
> Since signed wrapping was UB, making it more defined is ok.
Your understanding is correct. Since signed overflow is UB in C/C++,
lowering from C to IR can make such programs more defined.
> Instead, they have to lower it as something like:
> if (x == INT_MAX)
> skip or whatever
Yes.
This means that the overflow check and the actual add operation can be
separated. This requires instruction selection to combine the check and
the add carefully, but it is beneficial for optimization because the add
can still be moved freely.
for (i < n) {
  if (x == INT_MAX)
    trap
  y = add nsw x, 1
  use(y)
}
=>
y = add nsw x, 1 // hoisted
for (i < n) {
  if (x == INT_MAX)
    trap
  use(y)
}
Juneyoung
On Thu, Oct 29, 2020 at 5:56 AM Stefanos Baziotis <
stefanos.baziotis at gmail.com> wrote:
> Hi Juneyoung,
>
> First of all, great job on your talk!
>
> This is a question I guess you'd be the best person to answer, but the rest
> of the LLVM community might want to participate.
>
> I was thinking about a UB-related example that has been discussed by
> multiple people
> (including you), all of them basically authors of this paper (
> https://www.cs.utah.edu/~regehr/papers/undef-pldi17.pdf):
>
> -- Before opt:
> for (int i = 0; i < n; ++i) {
>   a[i] = x + 1;
> }
>
> -- After opt (LICM):
> int tmp = x + 1;
> for (int i = 0; i < n; ++i) {
>   a[i] = tmp;
> }
> // Assume `tmp` is never used again.
>
> The reasoning here is: let's make signed wrapping _deferred_ UB that will
> only occur if the value is used in one of X ways (e.g., as a denominator).
> To that end, if n == 0 and x == INT_MAX, UB will never occur because the
> value is never used.
>
> But, by doing that, the first point is:
> If we translate this into machine code, the signed wrapping _will_ happen,
> even though the value won't be used.
>
> Now, imagine that on some platform P, signed wrapping explodes the
> computer.
> The computer _will_ explode (should explode? more on that later)
> even if `n == 0`, something that would not happen in the original code.
>
> So, to justify this transformation as correct, implicitly, poison has
> _added definedness_ to signed wrapping: specifically, that the
> computer won't explode if SW happens. AFAIU, that is ok as far as C++
> semantics
> are concerned:
> Since signed wrapping was UB, making it more defined is ok.
>
> But that definedness has now created a burden for whoever is writing a
> back-end from LLVM IR to P (the SW-exploding platform).
> That is, if they see an `add nsw`, they can't lower it to a trivial
> signed add, since if they do that and x == INT_MAX, the computer will
> explode, and that violates the semantics of _LLVM IR_ (since we mandated
> that SW doesn't explode the machine).
>
> Instead, they have to lower it as something like:
> if (x == INT_MAX)
> skip or whatever
>
> Is this whole thinking correct? UB, undef, and poison are all very subtle,
> so I'm trying to wrap my head around them.
>
> Thanks,
> Stefanos Baziotis
>
--
Juneyoung Lee
Software Foundation Lab, Seoul National University