
The validation accuracy on CIFAR-10 is almost 99% #694

@sicnarf1a

Description


Hi.

I modified DoReFa-Net and applied it to CIFAR-10.

I mistakenly omitted the gradient_override_map call and quantized the last activation of the network, yet the validation accuracy is over 99%.

I know there is a problem with the code, but how should I interpret a validation accuracy of over 99%?

Here is the code:

...
...
...
with tf.name_scope('last_layer') as scope:
    l = tf.matmul(l, kernel)
    l = tf.nn.bias_add(l, biases)

    # I omitted the code
    # with G.gradient_override_map({"Round": "Identity"})
    l = tf.round(tf.clip_by_value(l, 0.0, 1.0) * float(2**BITA - 1)) / float(2**BITA - 1)
logits = tf.identity(l)
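For reference, the quantization line above implements the standard DoReFa-style k-bit uniform quantizer: clip to [0, 1], then snap to one of 2**k evenly spaced levels. A minimal NumPy sketch of what the forward pass computes (the function name `quantize_k` is my own, for illustration):

```python
import numpy as np

def quantize_k(x, k):
    # DoReFa-style k-bit uniform quantizer:
    # clip to [0, 1], scale to [0, 2**k - 1], round, and rescale.
    n = float(2**k - 1)
    return np.round(np.clip(x, 0.0, 1.0) * n) / n

# With k = 2, values snap to the levels {0, 1/3, 2/3, 1}
print(quantize_k(np.array([-0.5, 0.2, 0.6, 1.4]), 2))
```

Note that tf.round has no useful gradient on its own, which is exactly why DoReFa-Net wraps it in gradient_override_map({"Round": "Identity"}) — the straight-through estimator that passes gradients through the rounding step unchanged. Omitting it, as in the snippet above, means the rounding op blocks the gradient at that point.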

And the results:

[0313 15:03:26 @monitor.py:363] DataParallelInferenceRunner/QueueInput/queue_size: 50
[0313 15:03:26 @monitor.py:363] GPUUtil/0: 15.429
[0313 15:03:26 @monitor.py:363] GPUUtil/1: 24.929
[0313 15:03:26 @monitor.py:363] GPUUtil/2: 26.786
[0313 15:03:26 @monitor.py:363] GPUUtil/3: 63.214
[0313 15:03:26 @monitor.py:363] QueueInput/queue_size: 0.80026
[0313 15:03:26 @monitor.py:363] accuracy: 1
[0313 15:03:26 @monitor.py:363] lr: 0.001
[0313 15:03:26 @monitor.py:363] regularize_loss: 0.090447
[0313 15:03:26 @monitor.py:363] validation_accuracy: 0.99565
[0313 15:03:26 @monitor.py:363] validation_cost: 1.8313
[0313 15:03:26 @group.py:42] Callbacks took 4.208 sec in total. DataParallelInferenceRunner: 4.142sec

I forgot to mention that before the activation quantization I got 91% accuracy, and I applied the quantization starting from that 91% pre-trained model.

Thank you very much for your help.
Best regards
