[PATCH] D10979: Update the branch weight metadata in JumpThreading pass.

David davidxl at google.com
Wed Jul 22 16:24:17 PDT 2015


davidxl added inline comments.

================
Comment at: lib/Transforms/Scalar/JumpThreading.cpp:1599
@@ +1598,3 @@
+                       << BB->getName() << " to " << SuccBB->getName() << "\n");
+          if (PredBBFreq <= BB2SuccBBFreq) {
+            // Here we record the index of SuccBB in successors of BB and
----------------
congh wrote:
> davidxl wrote:
> > It is unclear how this can happen.
> Consider the following CFG and assume the edge frequency is 100 for each edge:
> 
> 
> ```
> A   B
>  \ /
>   C
>  / \
> D   E
> ```
> 
> In some pass C may be split into two blocks:
> 
> 
> ```
>  A   B
>  |   |
>  C1 C2
>  | X |
>  D   E
> ```
> 
> Then each edge out of C1/C2 has frequency 50. Now suppose that in the jump-threading pass we thread B->E across C2 (C2->D is impossible at runtime): B has frequency 100 but C2->E has frequency 50. This is when the condition is not met.
What I meant is that since at runtime PredBB->BB can only be followed by BB->SuccBB, Freq(BB->SuccBB) can be proved to be no less than Freq(PredBB->BB) -- so that check is unnecessary.
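The disagreement above hinges on how Freq(BB->SuccBB) relates to Freq(PredBB->BB). As a standalone sketch (an illustrative model only, not LLVM code or its APIs), the frequency of an outgoing edge can be viewed as the sum, over BB's incoming edges, of each edge's frequency times the probability of then leaving BB through SuccBB:

```cpp
#include <cstdint>
#include <vector>

// PredEdge models one incoming edge P->BB together with the (profile-derived)
// probability that control entering BB via P then exits through SuccBB.
struct PredEdge {
  uint64_t Freq;     // Freq(P->BB)
  double ProbToSucc; // Prob(exit via SuccBB | entered via P), in [0, 1]
};

// Freq(BB->SuccBB) under this model: each predecessor edge contributes its
// frequency weighted by the probability of continuing on to SuccBB.
uint64_t succEdgeFreq(const std::vector<PredEdge> &Preds) {
  double Sum = 0;
  for (const PredEdge &E : Preds)
    Sum += static_cast<double>(E.Freq) * E.ProbToSucc;
  return static_cast<uint64_t>(Sum);
}
```

If the profile agrees with the runtime behavior that PredBB->BB is always followed by BB->SuccBB (ProbToSucc == 1), the contribution of PredBB alone already makes succEdgeFreq at least Freq(PredBB->BB), which is the invariant davidxl describes. In congh's split-CFG example, however, the recorded probability for C2->E is 0.5 even though it is 1 at runtime, so succEdgeFreq({{100, 0.5}}) is 50 < 100 and the check in the patch fails.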

================
Comment at: lib/Transforms/Scalar/JumpThreading.cpp:1614
@@ +1613,3 @@
+    if (SuccIndexToUpdate >= 0) {
+      // We need to update the weights of BB's successors.
+      // Instead of scaling the weight of BB->SuccBB by 1 - Freq(PredBB->BB) /
----------------
congh wrote:
> davidxl wrote:
> > The scaling code looks unnecessarily complicated. How about this:
> > 
> > a) Pre-normalization 
> >   1) first sum up all the weights before the update
> >   2) if the sum is too small, scale it up to a fixed val such as ScaledNumber<64>(1,20) 
> >   3) compute the scale factor and rescale the weights
> > 
> > b) update BB->SuccBB with (1 - Freq(PredBB->BB)/Freq(BB->SuccBB))
> > 
> > c) post-normalization: scale all weights such that the weight sum is 2^20
> > 
> > Some helper routine can be defined.
> > 
> This method looks good. I have two questions:
> 
> 1. How do we tell if the sum is too small? Can we just normalize all weights such that their sum is 2^20?
> 2. Is c) necessary?
1) I think it is fine to do normalization unconditionally.

2) c) is not strictly necessary. However, I do expect that in the future we can assert that the weight sum is always normalized at any given time.
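The a)/b)/c) scheme above can be sketched as follows (the helper names are made up for illustration; this is not the actual LLVM implementation, which would use the real weight metadata and ScaledNumber arithmetic):

```cpp
#include <cstdint>
#include <vector>

// Steps a) and c): rescale Weights in place so that they sum to 2^20.
// Integer division leaves the final sum off by at most a small rounding
// error, which is acceptable for branch weight metadata.
void normalizeWeights(std::vector<uint64_t> &Weights) {
  const uint64_t Target = 1ULL << 20;
  uint64_t Sum = 0;
  for (uint64_t W : Weights)
    Sum += W;
  if (Sum == 0)
    return; // Nothing meaningful to scale.
  for (uint64_t &W : Weights)
    W = W * Target / Sum;
}

// Step b): scale the weight of BB->SuccBB by
// (1 - Freq(PredBB->BB)/Freq(BB->SuccBB)) after threading PredBB->SuccBB.
uint64_t scaleSuccWeight(uint64_t SuccWeight, uint64_t PredBBFreq,
                         uint64_t BB2SuccBBFreq) {
  if (BB2SuccBBFreq == 0 || PredBBFreq >= BB2SuccBBFreq)
    return 0; // The threaded edge carried all of this edge's frequency.
  return SuccWeight - SuccWeight * PredBBFreq / BB2SuccBBFreq;
}
```

Normalizing first bounds the weights so the multiplication in step b) cannot overflow, and normalizing again afterwards restores the 2^20 sum that, per point 2), could later be asserted as an invariant.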


http://reviews.llvm.org/D10979

More information about the llvm-commits mailing list