<div dir="ltr"><div dir="ltr"><br></div><div class="gmail_quote"><div dir="ltr" class="gmail_attr">On Tue, May 7, 2019 at 7:16 AM Kevin Neal <<a href="mailto:Kevin.Neal@sas.com">Kevin.Neal@sas.com</a>> wrote:<br></div><blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><br>
> How should we handle regressions in code quality that are exposed by tests?

One idea I had (or I may be remembering someone else's suggestion) was to grep the existing IR regression tests for 'fsub -0.0', then create a sibling test using fneg. If that doesn't result in the same output, then we need a pattern-matching enhancement.
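For concreteness, here is a minimal sketch of what such a sibling pair might look like; the function names, the grep invocation, and the test path are illustrative, not taken from an actual test:

  ; find candidates with something like:
  ;   grep -rl 'fsub -0.0' llvm/test/

  ; existing idiom: negation spelled as a subtraction from -0.0
  define float @neg_via_fsub(float %x) {
    %r = fsub float -0.000000e+00, %x
    ret float %r
  }

  ; sibling test using the unary fneg instruction
  define float @neg_via_fneg(float %x) {
    %r = fneg float %x
    ret float %r
  }

Running the same pass or backend over both and diffing the CHECK lines would show where the fneg form still falls behind the fsub form.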