
Trying to create frozen inference graph. Multiple nodes with the names 'cls_fc' and 'cls_prob' & 'bbox_fc'. Any help? #66

Closed
karansomaiah opened this issue Jan 11, 2019 · 5 comments

@karansomaiah

Hi everyone,
I have a set of saved checkpoints, but to run inference I currently have to load the test.py file, modify it slightly, and run inference from there each time. It would be much easier to convert the checkpoints to a frozen graph. Freezing requires the names of the output nodes for the bounding boxes and class scores; 'cls_fc', 'cls_prob' and 'bbox_fc' look like the nodes of interest, but I don't see them among the nodes of the graph.

Printing nodes using:
[n for n in tf.get_default_graph().as_graph_def().node]

If anybody could help me identify the respective output nodes, that would be great.
Thanks!
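For reference, I'm narrowing the list down by filtering the node names for likely substrings ('cls' and 'bbox' are just my guesses at what the head names contain):

graph_def = tf.get_default_graph().as_graph_def()
candidates = [n.name for n in graph_def.node
              if 'cls' in n.name or 'bbox' in n.name]
print('\n'.join(candidates))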

@karansomaiah
Author

I've solved the problem by creating identity ops with the desired names, wrapping the corresponding outputs listed in the network_desp file. Closing the issue.
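In short, I wrap each tensor from the test collection in a tf.identity op so the node gets a stable, known name. A minimal sketch (the 'cls_prob' key is just an example; the real keys come from network_desp):

test_collect_dict = net.get_test_collection()
cls_prob = tf.identity(test_collect_dict['cls_prob'], name='cls_prob')

The full script is in my comment below.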

@ChienLiu

ChienLiu commented Jan 31, 2019

Hello @karansomaiah,
Can you share the code you used to freeze the graph?
I'm running into some problems with this process as well.

Related to your question: did you use 'save/restore_all', or something else?

Thanks a lot!

@karansomaiah
Author

karansomaiah commented Feb 4, 2019

Hey @ChienLiu,
I used the following code to create the frozen inference graph.
One warning: when you load the frozen graph back for inference, you still have to import the network_desp.py file, because the graph references custom operations that TensorFlow doesn't know about on its own. I'm still working on converting those to plain TensorFlow operations and will let you know if I find something.
Until then, here's the script:

from config import cfg, config

import argparse
import network_desp
import tensorflow as tf
import os

from tensorflow.python.tools import freeze_graph

def make_parser():
    parser = argparse.ArgumentParser('Inference Graph Writer')
    parser.add_argument(
        '-f', '--file', 
        default='path/to/checkpoint', help='checkpoint file')
    parser.add_argument(
        '--output_dir', '-o', 
        default='path/to/store/frozen_graphs/')
    parser.add_argument(
        '--devices', '-d',
        default='0'
    )
    return parser

def create_inference_graph(checkpoint_file, output_dir, dev):
    
    """Create the frozen inference graph from a checkpoint.

    Args:
        checkpoint_file: path to the checkpoint file.
        output_dir: directory where the frozen inference graph is stored.
        dev: device id(s) to expose via CUDA_VISIBLE_DEVICES.
    """
    
    # reset default graph
    tf.reset_default_graph()
    os.environ["CUDA_VISIBLE_DEVICES"] = dev

    # tf-config stuff
    tfconfig = tf.ConfigProto(allow_soft_placement=True)
    tfconfig.gpu_options.allow_growth = True
    
    # load session
    sess = tf.Session(config=tfconfig)

    # network definition
    net = network_desp.Network()
    inputs = net.get_inputs()
    net.inference('TEST', inputs)

    # get test tensors collection
    test_collect_dict = net.get_test_collection()
    
    ## create identity ops here for the test collection
    outputs = get_output_nodes(test_collect_dict)

    ## start creating the inference graph from this point using some definitions
    saver = tf.train.Saver()
    input_saver_def = saver.as_saver_def()
    frozen_graph_path = os.path.join(output_dir, 'frozen_inference_graph_1.pb')

    # get the output key names
    output_node_names = ','.join(outputs.keys())
    print("Identified output names are:", output_node_names)
    print("tf.get_collection():")
    for var in tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES):
        print(var)
    
    # printing DEBUG stuff
    #print(tf.get_default_graph().get_operations())

    # save the model
    frozen_graph_def = freeze_graph.freeze_graph_with_def_protos(
        input_graph_def=tf.get_default_graph().as_graph_def(),
        input_saver_def=input_saver_def,
        input_checkpoint=checkpoint_file,
        output_node_names=output_node_names,
        restore_op_name='save/restore_all',
        filename_tensor_name='save/Const:0',
        output_graph=frozen_graph_path,
        clear_devices=True,
        initializer_nodes='')
    
    print("Saved Model @ ", frozen_graph_path)
    

def get_output_nodes(test_collection_tensors,
                     output_collection_name='inference_op'):
    """Wrap the test-collection tensors in named identity ops, as in
    exporter.py from the TF Object Detection API.

    Args:
        test_collection_tensors: dict retrieved from net.get_test_collection().
    Returns:
        outputs: dict mapping each output name to its identity op.
    """
    outputs = {
        output_key: tf.identity(output_val, name=output_key)
        for output_key, output_val in test_collection_tensors.items()
    }

    for output_key in outputs:
        tf.add_to_collection(output_collection_name, outputs[output_key])

    return outputs


if __name__ == '__main__':
    parser = make_parser()
    args = parser.parse_args()
    create_inference_graph(args.file, args.output_dir, args.devices)
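
As for loading the frozen graph back for inference, this is roughly what I do; the import of network_desp is what registers the custom ops, and 'cls_prob:0' is just an example tensor name that depends on your test-collection keys:

import tensorflow as tf
import network_desp  # registers the custom ops the frozen graph references

def load_frozen_graph(pb_path):
    # deserialize the GraphDef and import it into a fresh graph
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(pb_path, 'rb') as f:
        graph_def.ParseFromString(f.read())
    graph = tf.Graph()
    with graph.as_default():
        tf.import_graph_def(graph_def, name='')
    return graph

graph = load_frozen_graph('path/to/frozen_inference_graph_1.pb')
sess = tf.Session(graph=graph)
cls_prob = graph.get_tensor_by_name('cls_prob:0')  # example output tensor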

@ChienLiu

Thanks for your help.
I'm trying to use TensorRT to optimize the network.
I think it will be a big challenge for me.
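My rough plan is the TF-TRT route, something like the sketch below (untested; the output names must match the identity ops from your script, and I expect the custom ops to be left un-converted):

import tensorflow as tf
from tensorflow.contrib import tensorrt as trt

# load the frozen GraphDef produced by your script
graph_def = tf.GraphDef()
with tf.gfile.GFile('frozen_inference_graph_1.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

# ask TF-TRT to replace supported subgraphs with TensorRT engines
trt_graph_def = trt.create_inference_graph(
    input_graph_def=graph_def,
    outputs=['cls_prob', 'bbox_fc'],  # example names from the issue title
    max_batch_size=1,
    max_workspace_size_bytes=1 << 30,
    precision_mode='FP16')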

@karansomaiah
Author

Have you been able to do it, @ChienLiu?
