<table border="1" cellspacing="0" cellpadding="8">
    <tr>
        <th>Issue</th>
        <td>
            <a href="https://github.com/llvm/llvm-project/issues/108828">108828</a>
        </td>
    </tr>

    <tr>
        <th>Summary</th>
        <td>
            Miscompilation (bad optimization) - results change at `-O1`
        </td>
    </tr>

    <tr>
      <th>Labels</th>
      <td>
            new issue
      </td>
    </tr>

    <tr>
      <th>Assignees</th>
      <td>
      </td>
    </tr>

    <tr>
      <th>Reporter</th>
      <td>
          ericastor
      </td>
    </tr>
</table>

<pre>
    I've had trouble minimizing this particular issue: `bugpoint` makes changes that start causing timeouts, and I don't know which passes are even at fault. Still, I have an example of LLVM IR code that produces different results (when run through either `lli` or `llc`) before and after optimization at `-O1`.

Before optimization, the attached file executes and exits with code 0; after optimization, it executes and exits with code 1.
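
For reproduction, something like the following should show the divergence (a sketch only; the exact `opt` invocation is illustrative, since only the tools and the `-O1` level are stated above):

```shell
# Run the attached IR directly; this exits with code 0.
lli linked.ll; echo $?

# Apply the -O1 pipeline with opt, then run the optimized
# bitcode; this exits with code 1 instead.
opt -O1 linked.ll -o linked.opt.bc
lli linked.opt.bc; echo $?
```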

Example code: [linked.ll.txt](https://github.com/user-attachments/files/17014317/linked.ll.txt)

</pre>
<img width="1px" height="1px" alt="" src="http://email.email.llvm.org/o/eJx8U01vnDwQ_jXmMmJlDMvCgUP2zbtSpESVeujdmAG7MTayx7vb_voKiJqmrXrBAjzzfMwzMkYzOcSOHc_s-JjJRNqHDoNRMpIPWe-Hb90TE6crgpYDUPCptwizcWY2342bgLSJsMhARiUrA5gYE0IOrOZ9mhZvHLGawyxfMYLS0k0YgbQkiCQDgZIpbn3MjD5RZOI_kG6AJxi8Y-JE8Or8DW7aKA2LjBEjyICAV3QgCUaZLEEOfSJ4Yzp5AukA73JeLIIf4fn5y0v-9BmUH3AHX4IfksIIgxlHDOgIAsZkKQITzU2jg5AckA4-TRrQkMawirLWrHpWitubYjVnooUeRx9w-y5HwgB-odUjScZvRFnN808Fq_mB8UfGH_bneS_79fLqAGkESSSVxgFGYxHwjirRKt4NgHdDEW6G9C6Js_L8F9i1k6H3UibqP0qLD3T-f_Ns_cXKB2DHszXuFYeDtQe6Ezs-MtFooiWy8oGJCxOXyZBO_UH5mYlLihjynfmMbh3nZaW_nsWJF1VZnJi4fOwp2h08G7pyaMtWZtgVJ1FXbdUInulu5C22UpSjqMamwUYNx1FhW5zGouUC28x0gouKt0VdlMey4gfJeclLUTS8OhayObGK4yyNPVh7nQ8-TNmW067gTSOazMoebdzWQAiHtz3FTIh1K0K3FuV9miKruDWR4nsbMmSxezFR-Xkxdp82E00vh99G0UL-M2L7HnxIRZaC7f5h7Ir4duRL8F9REROXjedm7i7k2okfAQAA__8yk0GH">