
Allow quantized output; save the Function's name.

Summary: Pull Request resolved: #2883

Differential Revision: D15296815

Pulled By: artemrakhov

fbshipit-source-id: 24c6fca8ad5d2374f485ff28255cd27d041eb5f5
artemrakhov authored and facebook-github-bot committed May 10, 2019
1 parent 6195409 commit 36e709c3763552e62d395a772fc720c96530c687
Showing with 13 additions and 2 deletions.
  1. +13 −2 tools/loader/ModelRunner.cpp
@@ -45,6 +45,8 @@ int main(int argc, char **argv) {
   Placeholder *output = EXIT_ON_ERR(LD->getSingleOutput());
   auto *outputT = bindings.allocate(output);

+  std::string modelName = loader.getFunction()->getName().str();
+
   // Compile the model, and perform quantization/emit a bundle/dump debug info
   // if requested from command line.
   loader.compile(bindings);
@@ -53,10 +55,19 @@ int main(int argc, char **argv) {
   if (!emittingBundle()) {
     loader.runInference(bindings);

-    llvm::outs() << "Model: " << loader.getFunction()->getName() << "\n";
+    llvm::outs() << "Model: " << modelName << "\n";

     // Print out the result of output operator.
-    outputT->getHandle().dump();
+    switch (outputT->getElementType()) {
+    case ElemKind::FloatTy:
+      outputT->getHandle<float>().dump();
+      break;
+    case ElemKind::Int8QTy:
+      outputT->getHandle<int8_t>().dump();
+      break;
+    default:
+      GLOW_UNREACHABLE("Unexpected output type");
+    }

     // If profiling, generate and serialize the quantization infos now that we
     // have run inference to gather the profile.
