[PATCH] D91751: [NFC][TFUtils] Extract out the output spec loader

Yundi Qian via Phabricator via llvm-commits llvm-commits at lists.llvm.org
Wed Nov 18 18:05:02 PST 2020


yundiqian added inline comments.


================
Comment at: llvm/include/llvm/Analysis/Utils/TFUtils.h:204-207
   TFModelEvaluator(StringRef SavedModelPath,
                    const std::vector<TensorSpec> &InputSpecs,
                    const std::vector<TensorSpec> &OutputSpecs,
                    const char *Tags = "serve");
----------------
mtrofin wrote:
> yundiqian wrote:
> > Do we still need this function?
> We can keep it as a lower primitive.
Isn't it the higher-level primitive? This function calls TFModelEvaluator(..., function_ref<TensorSpec(size_t)> GetOutputSpecs, ...)
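To illustrate the delegation being discussed, here is a minimal sketch (with hypothetical stand-in types, not the real LLVM API) of a vector-taking convenience overload forwarding to the callback-taking primitive:

```cpp
#include <cstddef>
#include <functional>
#include <string>
#include <vector>

// Hypothetical stand-in for the real TensorSpec; for illustration only.
struct TensorSpec { std::string Name; };

class TFModelEvaluator {
public:
  // Lower-level primitive: output specs are produced on demand by index,
  // so callers can project them out of richer wrapper types.
  TFModelEvaluator(const std::string &Path,
                   std::function<TensorSpec(size_t)> GetOutputSpecs,
                   size_t NumOutputSpecs) {
    (void)Path; // a real evaluator would load the saved model here
    for (size_t I = 0; I < NumOutputSpecs; ++I)
      OutputSpecs.push_back(GetOutputSpecs(I));
  }

  // Higher-level convenience overload: a plain vector delegates to the
  // callback form with an identity projection.
  TFModelEvaluator(const std::string &Path,
                   const std::vector<TensorSpec> &Specs)
      : TFModelEvaluator(Path, [&](size_t I) { return Specs[I]; },
                         Specs.size()) {}

  const std::vector<TensorSpec> &outputSpecs() const { return OutputSpecs; }

private:
  std::vector<TensorSpec> OutputSpecs;
};
```

In this sketch the vector overload is the "higher" one, which matches the question above: keeping the function_ref form as the primitive lets both plain vectors and wrapper types feed it.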


================
Comment at: llvm/lib/Analysis/DevelopmentModeInlineAdvisor.cpp:486
+  Evaluator = std::make_unique<TFModelEvaluator>(
+      ModelPath, InputSpecs, [&](size_t I) { return OutputSpecs[I].Spec; },
+      OutputSpecs.size());
----------------
mtrofin wrote:
> yundiqian wrote:
> > why not pass in OutputSpecs?
> because those are LoggedFeatureSpecs, so they include the extra name for logging. The evaluator doesn't care about that, and we shouldn't muddy the abstraction layers.
How about making a copy into a std::vector<TensorSpec> OutputSpec? It would make the code much easier to read.
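The alternative being proposed can be sketched as a by-value projection (the types here are hypothetical stand-ins, not the real LLVM API; note a std::vector cannot hold plain references, so avoiding the copy would require std::reference_wrapper):

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical stand-ins for the types discussed; for illustration only.
struct TensorSpec { std::string Name; };
struct LoggedFeatureSpec {
  TensorSpec Spec;
  std::string LoggingName; // extra data the evaluator does not need
};

// Copy the evaluator-relevant part out of each logged spec, so the
// evaluator never sees the logging-only fields.
std::vector<TensorSpec>
projectSpecs(const std::vector<LoggedFeatureSpec> &Logged) {
  std::vector<TensorSpec> Out;
  Out.reserve(Logged.size());
  for (const auto &L : Logged)
    Out.push_back(L.Spec);
  return Out;
}
```

The lambda in the patch ([&](size_t I) { return OutputSpecs[I].Spec; }) achieves the same projection without materializing the temporary vector; the trade-off is an extra copy versus an extra constructor overload.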


Repository:
  rG LLVM Github Monorepo

CHANGES SINCE LAST ACTION
  https://reviews.llvm.org/D91751/new/

https://reviews.llvm.org/D91751
