[llvm-dev] Move InlineCost.cpp out of Analysis?

Philip Reames via llvm-dev llvm-dev at lists.llvm.org
Mon Apr 18 16:32:04 PDT 2016



On 04/18/2016 03:00 PM, Chandler Carruth via llvm-dev wrote:
> On Mon, Apr 18, 2016 at 2:48 PM Hal Finkel <hfinkel at anl.gov> wrote:
>
>
>
>     ------------------------------------------------------------------------
>
>         *From: *"Xinliang David Li" <davidxl at google.com>
>
>         On Mon, Apr 18, 2016 at 2:33 PM, Mehdi Amini <mehdi.amini at apple.com> wrote:
>
>             In the current case at stake: the issue is that we can't
>             have the Analysis library use anything from the
>             ProfileData library. Conceptually there is a problem IMO.
>
>
>
>         Yes -- this is a very good point.
>
>     Independent of anything else, +1.
>
>
> The design of ProfileData, and of reading profile information in the 
> entire middle end, had a really fundamental set of invariants that 
> folks seem to have lost track of:
>
> a) There is exactly *one* way to get at profile information from 
> general analyses and transforms: a dedicated analysis pass that 
> manages access to the profile info.
>
> b) There is exactly *one* way for this analysis to compute this 
> information from an *external* profile source: profile metadata 
> attached to the IR.
>
> c) There could be many external profile sources, but all of them 
> should be read and then translated into metadata annotations on the IR 
> so that serialization / deserialization preserve them in a common 
> format and we can reason about how they work.
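
For concreteness, here is a minimal sketch of what (b) and (c) look like in
practice: a transform lowering some external profile source into the standard
!prof branch_weights annotation. The ExternalCounts map is a hypothetical
stand-in for whatever the external source provides; the MDBuilder and IR calls
are the usual LLVM C++ APIs.

    #include "llvm/ADT/DenseMap.h"
    #include "llvm/IR/Function.h"
    #include "llvm/IR/Instructions.h"
    #include "llvm/IR/LLVMContext.h"
    #include "llvm/IR/MDBuilder.h"
    using namespace llvm;

    // Hypothetical external source: per-branch (taken, not-taken) counts.
    using BranchCounts = DenseMap<BranchInst *, std::pair<uint32_t, uint32_t>>;

    static void annotateFromExternalProfile(Function &F,
                                            const BranchCounts &ExternalCounts) {
      MDBuilder MDB(F.getContext());
      for (BasicBlock &BB : F) {
        auto *BI = dyn_cast<BranchInst>(BB.getTerminator());
        if (!BI || !BI->isConditional())
          continue;
        auto It = ExternalCounts.find(BI);
        if (It == ExternalCounts.end())
          continue;
        // Attach branch_weights so every downstream analysis sees the same
        // canonical annotation, regardless of where the counts came from.
        BI->setMetadata(LLVMContext::MD_prof,
                        MDB.createBranchWeights(It->second.first,
                                                It->second.second));
      }
    }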
>
>
> This layering is why it is only a transform that accesses ProfileData 
> -- it is responsible for annotating the IR and nothing else. Then the 
> analysis uses these annotations and never reads the data directly.
>
> I think this is a really important separation of concerns as it 
> ensures that we don't get an explosion of different analyses 
> supporting various different subsets of profile sources.
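
And a sketch of the analysis side of that separation: the consumer only decodes
the !prof annotation the transform left behind, and never touches a profile
reader. The helper below is made up, but the metadata shape
(!{!"branch_weights", i32 T, i32 F}) and the accessors are the standard ones.

    #include "llvm/IR/Constants.h"
    #include "llvm/IR/Instructions.h"
    #include "llvm/IR/LLVMContext.h"
    #include "llvm/IR/Metadata.h"
    using namespace llvm;

    // Read (taken, not-taken) weights off a conditional branch, if annotated.
    static bool getBranchWeights(const BranchInst *BI, uint64_t &TrueWeight,
                                 uint64_t &FalseWeight) {
      MDNode *MD = BI->getMetadata(LLVMContext::MD_prof);
      if (!MD || MD->getNumOperands() != 3)
        return false;
      auto *Tag = dyn_cast<MDString>(MD->getOperand(0));
      if (!Tag || Tag->getString() != "branch_weights")
        return false;
      auto *T = mdconst::dyn_extract<ConstantInt>(MD->getOperand(1));
      auto *F = mdconst::dyn_extract<ConstantInt>(MD->getOperand(2));
      if (!T || !F)
        return false;
      TrueWeight = T->getZExtValue();
      FalseWeight = F->getZExtValue();
      return true;
    }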
>
>
> Now, the original design only accounted for profile information 
> *within* a function body; clearly it needs to be extended to support 
> interprocedural information. But I would still expect that to follow a 
> similar layering where we first read the data into IR annotations, 
> then have an analysis pass (this time a module analysis pass in all 
> likelihood) that brokers access to these annotations through an API 
> that can do intelligent things like synthesizing it from the "cold" 
> attribute or whatever when missing.
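
A rough sketch of what such a module-level broker might look like (the
ProfileBroker name and the hotness threshold are made up; this assumes the
Optional<uint64_t> form of Function::getEntryCount, and the cold-attribute
check is the existing API):

    #include "llvm/IR/Attributes.h"
    #include "llvm/IR/Function.h"
    using namespace llvm;

    class ProfileBroker {
      uint64_t HotCountThreshold; // hypothetical tuning knob
    public:
      explicit ProfileBroker(uint64_t Threshold) : HotCountThreshold(Threshold) {}

      bool isColdFunction(const Function &F) const {
        // Prefer the annotation written from the external source...
        if (auto Count = F.getEntryCount())
          return *Count == 0;
        // ...and synthesize an answer from the attribute when it is missing.
        return F.hasFnAttribute(Attribute::Cold);
      }

      bool isHotFunction(const Function &F) const {
        if (auto Count = F.getEntryCount())
          return *Count >= HotCountThreshold;
        return false; // no evidence of hotness without a profile
      }
    };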
>
> -Chandler
+1 to this.

p.s. I have my own source of profiling information, which I translate 
into metadata.  The fact that this "just works" is valuable and is 
definitely a design goal we should retain.