
Biggan expects whole integer as a label rather than a tensor #29

Closed
CrackerHax opened this issue Jun 11, 2019 · 1 comment

Comments

@CrackerHax

CrackerHax commented Jun 11, 2019

It seems that BigGAN can only be used unconditionally, and this might be the case for InfoGAN too. As far as I can tell, compare_gan does not make use of labels at all.

Here is my parse function:

  def parse_function(self, proto):
      # Parse a serialized tf.Example holding a raw image and a label vector.
      feature_map = tf.parse_single_example(
          proto,
          features={
              'image': tf.FixedLenFeature([], tf.string),
              'label': tf.FixedLenSequenceFeature(
                  [NUM_CLASSES], tf.int64, allow_missing=True, default_value=0),
          })

      # Decode the raw bytes and scale pixel values to [0, 1].
      image = tf.decode_raw(feature_map['image'], tf.uint8)
      image = tf.reshape(image, (IMAGE_SIZE, IMAGE_SIZE, 3))
      image = tf.cast(image, tf.float32) / 255.0

      # The label comes back as a tensor of shape (1, NUM_CLASSES), not a scalar.
      label = tf.cast(feature_map['label'], tf.int32)
      label = tf.reshape(label, (-1, NUM_CLASSES))

      return image, label

This starts throwing errors when the layers are being set up. It only works if I return the label as an integer (not a tensor).

@Marvin182
Contributor

If you are doing one-class classification, then you can pass around the label ID (a scalar) instead of a vector. ModularGAN._get_one_hot_labels will then encode the label as a one-hot vector.
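
For illustration, a minimal sketch of a parse function that returns a scalar label ID instead of a vector, assuming the TFRecords store the class index as a single int64 under a 'label' key (the feature name and default value are assumptions) and using the same TF 1.x API and NUM_CLASSES/IMAGE_SIZE constants as in the snippet above:

    def parse_function(self, proto):
        # Sketch only: assumes each tf.Example stores the class index as a
        # single int64 under the 'label' key (feature name is an assumption).
        feature_map = tf.parse_single_example(
            proto,
            features={
                'image': tf.FixedLenFeature([], tf.string),
                'label': tf.FixedLenFeature([], tf.int64, default_value=0),
            })

        image = tf.decode_raw(feature_map['image'], tf.uint8)
        image = tf.reshape(image, (IMAGE_SIZE, IMAGE_SIZE, 3))
        image = tf.cast(image, tf.float32) / 255.0

        # Return the label as a scalar ID; the one-hot encoding (roughly
        # tf.one_hot(label, NUM_CLASSES)) can then happen inside the model.
        label = tf.cast(feature_map['label'], tf.int32)

        return image, label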
