
How to reuse variables when adding nodes to the DeepLab graph? #8897

@zheyuanWang

Description


Also regarding the modification: since you are creating the first layer (which outputs feature_2) under a different variable scope, my bet is that it would not be shared with the layer that creates feature_1 ;)

Originally posted by @YknZhu in #8864 (comment)


Hi,

I want to reuse some layers of the pre-defined mobilenet-v3 backbone, so I made some modifications in research/slim/nets/mobilenet/mobilenet.py,

basically like this:

```python
with tf.variable_scope('parallel_layer', reuse=tf.compat.v1.AUTO_REUSE) as scope:
    try:
        net1 = opdef.op(net1, **params)
        net2 = opdef.op(net2, **params)
    except Exception:
        print('Failed to create op %i: %r params: %r' % (i, opdef, params))
```

P.S. I also tried it without the extra variable_scope, i.e. "MobilenetV3" instead of "MobilenetV3/parallel_layer".

According to the graph in TensorBoard, I now have two nodes (Conv & Conv_1) instead of only one (Conv). I also can't load the pre-trained checkpoint anymore, since the graph was changed.
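On the checkpoint point: if the only change is that the same variables now live under an extra scope, `tf.train.init_from_checkpoint` can remap name prefixes via its assignment map, so the old checkpoint can still initialize the renamed variables. Below is a minimal, self-contained sketch of that mechanism — it uses toy variable names and a freshly written checkpoint, not the actual DeepLab/MobileNet checkpoint, and assumes the TF1 compat API:

```python
import os
import tempfile
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

ckpt_path = os.path.join(tempfile.mkdtemp(), 'model.ckpt')

# 1) Simulate the pre-trained checkpoint: a variable under 'MobilenetV3/Conv'.
g1 = tf.Graph()
with g1.as_default():
    with tf.variable_scope('MobilenetV3'):
        with tf.variable_scope('Conv'):
            w1 = tf.get_variable('weights', shape=[3, 3, 3, 8])
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        w1_val = sess.run(w1)
        tf.train.Saver().save(sess, ckpt_path)

# 2) New graph: the same layer now sits under an extra 'parallel_layer' scope,
#    so checkpoint names no longer match; remap the prefix when initializing.
g2 = tf.Graph()
with g2.as_default():
    with tf.variable_scope('MobilenetV3'):
        with tf.variable_scope('parallel_layer'):
            with tf.variable_scope('Conv'):
                w2 = tf.get_variable('weights', shape=[3, 3, 3, 8])
    # keys are checkpoint name prefixes, values are current-graph prefixes
    tf.train.init_from_checkpoint(
        ckpt_path, {'MobilenetV3/': 'MobilenetV3/parallel_layer/'})
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        w2_val = sess.run(w2)  # holds the checkpointed values, despite the rename
```

Any genuinely new variables (ones with no counterpart in the checkpoint) simply stay out of the assignment map and keep their random initializers.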

How do I reuse the layers correctly? And how can I tell whether I did it right?
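One way to check is to count variables rather than graph nodes: seeing both Conv and Conv_1 in TensorBoard is expected even when sharing works, because each call adds a new op, while the kernel/bias variables are what AUTO_REUSE deduplicates. A minimal sketch with tf.compat.v1 (a toy conv layer standing in for the mobilenet opdef):

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

def conv(x):
    # tf.layers.conv2d registers its kernel/bias under the current variable scope
    return tf.layers.conv2d(x, filters=8, kernel_size=3, name='conv')

graph = tf.Graph()
with graph.as_default():
    x1 = tf.placeholder(tf.float32, [None, 32, 32, 3])
    x2 = tf.placeholder(tf.float32, [None, 32, 32, 3])
    # With AUTO_REUSE, the second call reuses the variables the first one created
    with tf.variable_scope('parallel_layer', reuse=tf.AUTO_REUSE):
        y1 = conv(x1)
        y2 = conv(x2)
    # Two conv *ops* now exist, but only one kernel/bias pair of *variables*:
    # ['parallel_layer/conv/bias', 'parallel_layer/conv/kernel']
    names = sorted(v.op.name for v in tf.global_variables())
```

If sharing had failed, `tf.global_variables()` would list four entries (an extra `conv_1/kernel` and `conv_1/bias`), so this check distinguishes the two cases directly.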
