[PATCH] D11442: Create a utility function normalizeEdgeWeights() in BranchProbabilityInfo that is used to normalize a list of weights so that the sum of them does not exceed UINT32_MAX.

Cong Hou congh at google.com
Thu Jul 23 14:13:02 PDT 2015


congh added inline comments.

================
Comment at: include/llvm/Analysis/BranchProbabilityInfo.h:165
@@ +164,3 @@
+  // If the computed sum fits in 32-bits, we're done.
+  if (Sum <= UINT32_MAX)
+    return 1;
----------------
davidxl wrote:
> congh wrote:
> > davidxl wrote:
> > > Why not do the normalization unconditionally, to avoid the precision loss introduced by incremental updates (for small weights)?
> > The incremental update itself can deal with precision loss; this function can then be called to guarantee that the sum of weights does not exceed UINT32_MAX.
> > 
> > If we decide to normalize all edge weights by scaling them up, we should call this normalization function every time we set or update an edge weight. Have we decided on this?
> You will need this for precision-preservation purposes, so this interface needs to take an extra argument indicating whether up-scaling should be done -- by default it can be off.
OK. But I am wondering how to tell whether the returned scale is for scaling up or for scaling down. This is the difficult part...
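
For illustration, here is a minimal sketch of the down-scaling case being
discussed. The name, signature, and rounding choices below are mine, not the
actual patch: the helper sums the weights in 64 bits, returns a scale of 1
when the sum already fits in 32 bits, and otherwise divides every weight by a
scale just large enough to bring the sum under UINT32_MAX.

  #include <algorithm>
  #include <cstdint>
  #include <vector>

  // Illustrative sketch only -- not the code from D11442. Scales a list of
  // edge weights down so that their sum fits in 32 bits, and returns the
  // scale that was applied (a return value of 1 means no scaling was done).
  static uint32_t normalizeEdgeWeights(std::vector<uint32_t> &Weights) {
    uint64_t Sum = 0;
    for (uint32_t W : Weights)
      Sum += W;

    // If the computed sum fits in 32 bits, we're done.
    if (Sum <= UINT32_MAX)
      return 1;

    // Divide every weight by a scale just large enough to bring the sum
    // back under UINT32_MAX, clamping each weight to at least 1 so that no
    // edge's weight is rounded away entirely.
    uint32_t Scale = static_cast<uint32_t>(Sum / UINT32_MAX) + 1;
    for (uint32_t &W : Weights)
      W = std::max(W / Scale, 1u);
    return Scale;
  }

If up-scaling were added behind a flag as suggested, one way to sidestep the
up-versus-down ambiguity might be to document the return value as always being
the divisor applied, and report any up-scaling factor separately -- though I
have not prototyped that.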


http://reviews.llvm.org/D11442