[cfe-dev] -Wunreachable-code and templates

Ted Kremenek kremenek at apple.com
Wed Feb 15 10:34:53 PST 2012


On Feb 14, 2012, at 4:35 PM, David Blaikie <dblaikie at gmail.com> wrote:

> So I've come up with some numbers for feedback.
> 
> Experiment:
> * Take the attached templates.cpp (it includes every C++11 header
> clang can compile (I was using libstdc++ for this), plus a bunch of
> boost headers - and a little hand-written sanity check)
> * preprocess it (with -std=c++11)
> * strip out the line directives (so that clang won't suppress all the
> diagnostics because they're in system headers)
> * using "perf stat -r100":
>  * run with -Wno-everything and either with or without
> -Wunreachable-code -c -std=c++11
>  * run the same thing with the template_unreachable.diff applied
> (I did this at runlevel 1 to try to reduce some of the noise/background)
> This is basically the worst case I can think of - lots of templated
> code, almost no instantiations or non-templated code.
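> 
> In shell terms it was roughly the following (commands illustrative
> rather than the exact script I used):
> 
>   # preprocess without #line markers so the system-header suppression
>   # doesn't hide the diagnostics
>   clang++ -std=c++11 -E -P templates.cpp > templates.i.cpp
>   # time compilation 100 times, with and without the warning
>   perf stat -r100 clang++ -std=c++11 -Wno-everything \
>       -c templates.i.cpp -o /dev/null
>   perf stat -r100 clang++ -std=c++11 -Wno-everything -Wunreachable-code \
>       -c templates.i.cpp -o /dev/null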
> 
> Results
> * you should observe relatively similar results to mine in results.txt except:
>  * I pruned the 100 repetitions of diagnostics from the results file,
> leaving only one sample of the diagnostic output from the two
> -Wunreachable-code runs (with & without my patch applied)
>  * I added the check to not build the CFG if we're in a dependent
> context and -Wunreachable-code is /not/ enabled after I'd already run
> my perf numbers, so the discrepancy in my results between the two
> versions without the flag enabled should probably not be there (I can
> rerun that to demonstrate if desired)
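> 
> (For concreteness, the kind of case the patched analysis adds is
> unreachable code sitting in a dependent context, i.e. a template body
> that may never be instantiated - something along these lines:
> 
>   template <typename T>
>   T sanity_check(T t) {
>     return t;
>     ++t;  // unreachable, but only diagnosable if we build a CFG for
>           // the uninstantiated (dependent) body
>   }
> 
>   int plain(int x) {
>     return x;
>     ++x;  // unreachable; caught with or without the patch
>   }
> 
> with the names here made up for illustration, not the actual test file.)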
> 
> & these were the execution time results I got:
> 
> No patch
>  No warning: 1.915069815 seconds ( +-  0.02% )
>  Warning (10 found): 1.923400323 seconds ( +-  0.02% )
> With patch
>  No warning: 1.937073564 seconds ( +-  0.03% ) (this should probably
> be closer to/the same as the first result - it was just run without
> the shortcut as called out above)
>  Warning (20 found - including my sanity check): 1.980802759 seconds
> ( +-  0.03% )
> 
> So about a 3% slowdown (in the enabled case), according to this
> experiment, which is a pretty extreme case.
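> (That's 1.980802759 s vs. 1.923400323 s for the warning-enabled runs
> with and without the patch - a ratio of about 1.03.)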
> 
> What do you reckon? Good? Bad? Need further data/better experiments?

Hi David,

Thanks for doing these measurements.  I think we need to do more investigation.

If you told me that we have the distinct possibility of incurring a 3% compile-time regression, I'd say that this warning would *never* be on by default.  A 3% regression is *huge* for a single warning.

Your measurements, however, are completely biased by your test case.  I think this test shows that the analysis doesn't become ridiculously slow in the worst case (a good property), but it doesn't tell me what the real performance regression is going to be on normal code.  For example, what is the build-time impact on the LLVM codebase, on Chrome, or on code that makes heavy use of Boost?  That's the kind of measurement I think we need to see.  There will be more variance in those numbers, but they are the ones that tell us the regression to actually expect from this warning.
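
Even a rough experiment along these lines (build setup illustrative; any way of injecting the extra flag is fine) would give a first-order number for a real codebase:

  # from an already-configured LLVM/Clang build tree, warm disk cache,
  # single-threaded to keep the numbers stable
  make clean && time make -j1
  make clean && time make -j1 CXXFLAGS=-Wunreachable-code

Run each configuration a few times and compare.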

As for the warnings found, what was the false positive rate?  Were these real issues?

Cheers,
Ted