[PATCH] D83733: [llvm] Development-mode InlineAdvisor
David Li via Phabricator via llvm-commits
llvm-commits at lists.llvm.org
Thu Jul 16 20:10:02 PDT 2020
davidxl added inline comments.
================
Comment at: llvm/lib/Analysis/DevelopmentModeInlineAdvisor.cpp:30
+
+static cl::opt<std::string> TFTrainedModelPath(
+ "ml-inliner-trained-model", cl::Hidden,
----------------
TrainedModel sounds misleading, as if the model has already been trained. Perhaps just TrainModel, ModelInTrain, or TrainingModel?
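For illustration only, one of the suggested spellings applied to the declaration above (the exact name is just a placeholder, not a request for this precise spelling):

  // Hypothetical rename sketch; assumes the patch's existing
  // #include "llvm/Support/CommandLine.h" and `using namespace llvm;`
  // are still in effect.
  static cl::opt<std::string> TFTrainingModelPath(
      "ml-inliner-training-model", cl::Hidden,
      cl::desc("Path of the model that is being trained"));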
================
Comment at: llvm/lib/Analysis/DevelopmentModeInlineAdvisor.cpp:131
+/// and can default to the default policy when we need to produce training logs
+/// from it.
+class DevelopmentModeMLInlineAdvisor : public MLInlineAdvisor {
----------------
Maybe document it like this:
The advisor operates in two modes: 1) log collection, and 2) inference (during training). In the first mode, the default advisor is used, while in the second, the MLInlineAdvisor is used.
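For example, formatted as the class comment (wording is only a sketch):

  /// DevelopmentModeMLInlineAdvisor operates in two modes:
  ///  1) log collection: inlining decisions come from the default (heuristic)
  ///     advisor while the features and decisions are written to a training log;
  ///  2) inference during training: decisions come from the model currently
  ///     being trained (via MLInlineAdvisor).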
================
Comment at: llvm/lib/Analysis/DevelopmentModeInlineAdvisor.cpp:132
+/// from it.
+class DevelopmentModeMLInlineAdvisor : public MLInlineAdvisor {
+public:
----------------
Naming: MLInlineAdvisorForTraining or TrainingModeInlineAdvisor
================
Comment at: llvm/lib/Analysis/DevelopmentModeInlineAdvisor.cpp:256
+/// DynamicModelRunner - training mode implementation. It uses TF C APIs to
+/// dynamically load and evaluate a TF SavedModel. Runtime performance is
+/// sacrificed for ease of use while training.
----------------
SavedModel --> TrainedModel or ModelBeingTrained
================
Comment at: llvm/lib/Analysis/DevelopmentModeInlineAdvisor.cpp:258
+/// sacrificed for ease of use while training.
+class DynamicModelRunner final : public MLModelRunner {
+public:
----------------
Perhaps change the DynamicModel part of the name to TrainModel, to keep it consistent with the option name.
================
Comment at: llvm/lib/Analysis/DevelopmentModeInlineAdvisor.cpp:356
+ CallBase &CB, OptimizationRemarkEmitter &ORE) {
+ if (IsDoingInference && !isLogging())
+ return MLInlineAdvisor::getAdviceFromModel(CB, ORE);
----------------
Can Inference and Logging both be on?
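Spelling out the combinations I have in mind (my reading of the snippet; please correct me if I am off):

  // IsDoingInference && !isLogging(): pure inference; early return with the
  //                                   MLInlineAdvisor's advice.
  // IsDoingInference &&  isLogging(): falls through; is the model's decision
  //                                   what ends up in the training log?
  // !IsDoingInference                : falls through to the default policy
  //                                   (with or without logging).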
================
Comment at: llvm/lib/Analysis/DevelopmentModeInlineAdvisor.cpp:448
+
+ bool DoesInference = false;
+ if (TFTrainedModelPath.empty())
----------------
Nit: IsDoingInference, for consistency with the member name used in getAdviceFromModel.
Repository:
rG LLVM Github Monorepo
CHANGES SINCE LAST ACTION
https://reviews.llvm.org/D83733/new/
https://reviews.llvm.org/D83733