This repository has been archived by the owner on Jun 13, 2020. It is now read-only.

Fix problematic case with opts==None and therefore no opts.use_dropout. #3

Closed · wants to merge 2 commits
2 changes: 1 addition & 1 deletion base_network.py
@@ -66,7 +66,7 @@ def hidden_layers_starting_at(self, layer, layer_sizes, opts=None):
                 num_outputs=size,
                 weights_regularizer=tf.contrib.layers.l2_regularizer(0.01),
                 activation_fn=tf.nn.relu)
-            if opts.use_dropout:
+            if opts!=None and opts.use_dropout:
                 layer = slim.dropout(layer, is_training=IS_TRAINING, scope="do%d" % i)
         return layer
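The guard in the PR works, but the idiomatic Python check is opts is not None rather than opts!=None, and a getattr with a default also tolerates option objects that simply lack the attribute. A minimal sketch of that idea (hypothetical helper name, not the repository's code):

```python
def dropout_enabled(opts):
    """Return True only when opts exists and asks for dropout.

    `opts is not None` is the idiomatic None check; getattr with a
    False default also covers option objects that do not define
    use_dropout at all.
    """
    return opts is not None and getattr(opts, "use_dropout", False)
```

With a helper like this, the layer loop could read the same whether or not opts was passed in.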

2 changes: 1 addition & 1 deletion event_log.py
@@ -1,5 +1,5 @@
 #!/usr/bin/env python
-import event_pb2
+from tensorflow.core.util import event_pb2
Owner commented:
This is actually importing a different proto; I want to use the event.proto at the root of this repo.

It's a bit hacky as it stands: this code requires you to run protoc event.proto --python_out=. (as mentioned in the README.md) before event_pb2.py is created. What I should do is try the import and, if it fails, warn that this command needs to be run (either that, or make it a proper part of some kind of setup.py init...)
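The try-the-import-and-warn fallback the owner describes could look like the following sketch (hypothetical helper name; the real fix would sit at the top of event_log.py):

```python
import warnings


def load_event_pb2():
    # Prefer the module generated from the repo's own event.proto via
    # `protoc event.proto --python_out=.` (see README.md); warn instead
    # of crashing at import time when it has not been generated yet.
    try:
        import event_pb2
        return event_pb2
    except ImportError:
        warnings.warn(
            "event_pb2 not found; run `protoc event.proto --python_out=.` "
            "as described in README.md"
        )
        return None
```

Callers can then check for None (or the warning) rather than hitting an opaque ImportError.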

import gzip
import matplotlib.pyplot as plt
import numpy as np