[llvm] [InstrProf] Adding utility weights to BalancedPartitioning (PR #72717)
Ellis Hoag via llvm-commits
llvm-commits at lists.llvm.org
Mon Nov 20 09:30:10 PST 2023
================
@@ -933,8 +933,15 @@ std::vector<BPFunctionNode> TemporalProfTraceTy::createBPFunctionNodes(
std::vector<BPFunctionNode> Nodes;
for (auto &Id : FunctionIds) {
auto &UNs = FuncGroups[Id];
- llvm::sort(UNs);
- UNs.erase(std::unique(UNs.begin(), UNs.end()), UNs.end());
+ llvm::sort(UNs.begin(), UNs.end(),
+ [](const UtilityNodeT &L, const UtilityNodeT &R) {
+ return L.id < R.id;
+ });
+ UNs.erase(std::unique(UNs.begin(), UNs.end(),
+ [](const UtilityNodeT &L, const UtilityNodeT &R) {
+ return L.id == R.id;
+ }),
+ UNs.end());
----------------
ellishg wrote:
Now that we have weights, do you think it makes sense to accumulate the weights of duplicated nodes instead? For example, if we had nodes `{(A, 1), (B, 2), (B, 3), (C, 2)}` then we would end up with `{(A, 1), (B, 5), (C, 2)}`.
I imagine this might be useful for compression. If a function has many duplicate instructions, it makes sense to weight those more heavily than instructions that appear only once.
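The accumulation idea above could be sketched as a sort followed by a single merge pass. This is a hypothetical illustration, not the PR's code: the struct below is a stand-in for `UtilityNodeT` (the real field names in the PR may differ), and `accumulateDuplicates` is an invented helper name.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical stand-in for the PR's UtilityNodeT: an id plus a weight.
struct UtilityNodeT {
  uint32_t Id;
  uint32_t Weight;
};

// Sort by id, then merge runs of equal ids, summing their weights.
// For example, {(A,1), (B,2), (B,3), (C,2)} becomes {(A,1), (B,5), (C,2)}.
std::vector<UtilityNodeT> accumulateDuplicates(std::vector<UtilityNodeT> UNs) {
  std::sort(UNs.begin(), UNs.end(),
            [](const UtilityNodeT &L, const UtilityNodeT &R) {
              return L.Id < R.Id;
            });
  std::vector<UtilityNodeT> Merged;
  for (const UtilityNodeT &UN : UNs) {
    if (!Merged.empty() && Merged.back().Id == UN.Id)
      Merged.back().Weight += UN.Weight; // accumulate the duplicate's weight
    else
      Merged.push_back(UN);
  }
  return Merged;
}
```

Compared to the `sort` + `std::unique` + `erase` pattern in the diff, this replaces the "drop duplicates" step with a merge that keeps the total weight of each id.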
https://github.com/llvm/llvm-project/pull/72717