This repository has been archived by the owner on Apr 10, 2024. It is now read-only.

Support for tf.SavedModel #46

Open
ludwigschubert opened this issue Mar 27, 2018 · 6 comments
Labels: enhancement (New feature or request), lucid.modelzoo


@ludwigschubert (Contributor)

It looks like modelzoo could directly support the new SavedModel standard. We would still need the metadata entries in modelzoo, but no longer require manual freezing of Variables into Constants in the graph definition.
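
For context, the manual step this would make unnecessary is roughly the following sketch; the output node name "logits" and the file path are placeholders for illustration, not part of any lucid API:

import tensorflow as tf

with tf.Session() as sess:
    # ...build or restore your model so its variables are initialized in this session
    # "logits" is a placeholder output node name; use your model's own output node(s).
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), ["logits"])
    with tf.gfile.GFile("path/to/frozen_graphdef.pb", "wb") as f:
        f.write(frozen_graph_def.SerializeToString())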

@xn8812 commented Jun 8, 2018

How exactly do you load a TF SavedModel in lucid? Thanks.

@harritaylor

Has there been an update on this? I'm currently trying to wrap my head around freezing a tf-slim model and having no luck. Thanks.

@ludwigschubert (Contributor, Author) commented Apr 18, 2019

@xn8812 @harritaylor while I don't think I can do proper SavedModel support until TF2.0, we recently added a simple way to save a model in a lucid-compatible format. :-)

I'll add more documentation as we use this functionality more, but you may already want to give it a try!
You would need to install lucid from HEAD for now: pip install --upgrade git+https://github.com/tensorflow/lucid.git@HEAD

Saving/exporting

The main reason importing and exporting models is so complex is that Lucid needs to know some additional metadata about your model. We now save this information in the exported model itself, so you need to provide it to Lucid at the time of exporting. Model.save relies on the default tf.Session, so you would usually call it within a context manager (with tf.Session()...). Here's the new API; a concrete worked example follows the parameter descriptions below:

import tensorflow as tf
from lucid.modelzoo.vision_base import Model

with tf.Session().as_default():
    # ...load or construct your own model
    Model.save("path/to/graphdef.pb", input_name, output_names, image_shape, image_value_range)

  • input_name: string The name of a tensor in your graph that contains the input image, usually a tf.placeholder, or a node right before your first convolutional layer. During import, lucid will replace this node with its own input parameterization. (See the lucid/optvis/param package.)
  • output_names: [string] The names of nodes in your graph that produce your model's final output, often called "logits", "predictions", or similar.
  • image_shape: (width, height, depth) The shape your model expects its input images to be. If your model accepts flexible shapes, use a size representative of the image sizes it was trained on.
  • image_value_range: (min, max) The range of values your model expects as its input. Older architectures were sometimes trained on value ranges like (0, 255) (because training images were read from files and kept as uint8s) or (-127, 127) (to center those values), while these days we mostly see (0, 1) or (-1, 1) float values. You'd get (0, 1) by dividing the values of a uint8 training image by 255.0.
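
For concreteness, a minimal sketch of an export might look like the following; the node names, image shape, and value range are assumptions about a hypothetical model, so substitute your own:

import tensorflow as tf
from lucid.modelzoo.vision_base import Model

with tf.Session().as_default() as sess:
    # ...load or construct your model and restore its weights into `sess`
    # "input" and "Softmax" are placeholder node names from a hypothetical graph.
    Model.save(
        "my_model.pb",
        input_name="input",
        output_names=["Softmax"],
        image_shape=[224, 224, 3],
        image_value_range=(0, 1),  # assumption: this model was trained on [0, 1] floats
    )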

To help you find out these parameters, we built some heuristics, though they may not work in all cases:

import tensorflow as tf
from lucid.modelzoo.vision_base import Model

with tf.Session().as_default():
    # ...load or construct your own model
    Model.suggest_save_args()

We hope the output can be helpful in certain cases:

Inferred: input_name = input (because it was the only Placeholder in the graph_def)
Inferred: image_shape = [16, 16, 3]
Inferred: output_names = ['Softmax']  (because those are all the Softmax ops)
# Please sanity check all inferred values before using this code.
# Incorrect `image_value_range` is the most common cause of feature visualization bugs! Most methods will fail silently with incorrect visualizations!
Model.save(
    input_name='input',
    image_shape=[16, 16, 3],
    output_names=['Softmax'],
    image_value_range=_,   # TODO (eg. '[-1, 1], [0, 1], [0, 255], or [-117, 138]')
  )

Loading

This is now easy since the saved graphdef contains all the necessary information:

from lucid.modelzoo.vision_base import Model
model = Model.load("path/to/graphdef.pb")
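
Once loaded, the model behaves like any other modelzoo model, so a quick sanity check is a feature visualization. A rough sketch follows; the objective "Softmax:0" is a placeholder, so pick a layer and channel that actually exist in your graph (on older lucid versions you may also need to call model.load_graphdef() first):

import lucid.optvis.render as render
from lucid.modelzoo.vision_base import Model

model = Model.load("path/to/graphdef.pb")
# "Softmax:0" means channel 0 of the node named "Softmax"; substitute your own layer.
images = render.render_vis(model, "Softmax:0")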

@harritaylor

@ludwigschubert fantastic!! Thank you very much. I managed to get the tf-slim model imported the old way after a few hours of slogging through it, but this method worked straight away. I'm now having issues with creating a greyscale visualisation for an odd-shaped image, but I don't think that warrants a separate issue...

@Kaustubh-Khatri

@ludwigschubert It looks like Model.save has been removed. Can you please tell me if there is an alternative to it?

@colah (Contributor) commented Sep 25, 2019

Model.save() is still present: https://github.com/tensorflow/lucid/blob/master/lucid/modelzoo/vision_base.py#L299

Is it possible you're using a really old version of lucid?
