* Training information

** Base model

Without pruning.

| Epoch | Time (s) | Train Loss | Test Loss | Train Acc | Test Acc | Sparsity |
|-------+----------+------------+-----------+-----------+----------+----------|
| 1     | 108.65   | 35.958     | 27.779    | 0.68      | 0.78     | 0.0      |
| 2     | 111.31   | 24.467     | 24.578    | 0.84      | 0.84     | 0.0      |
| 3     | 98.24    | 19.873     | 20.623    | 0.87      | 0.87     | 0.0      |
| 4     | 109.19   | 17.795     | 22.941    | 0.89      | 0.87     | 0.0      |
| 5     | 106.14   | 16.155     | 20.806    | 0.9       | 0.87     | 0.0      |
| 6     | 105.55   | 14.864     | 21.017    | 0.91      | 0.87     | 0.0      |
| 7     | 103.66   | 13.645     | 22.307    | 0.92      | 0.87     | 0.0      |
| 8     | 104.53   | 11.989     | 21.826    | 0.93      | 0.86     | 0.0      |
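The Sparsity column is the fraction of weights that are exactly zero (0.0 throughout this run, since nothing is pruned). As a minimal sketch of how such a number can be computed, assuming sparsity is counted globally over all weight tensors (the learner's own implementation is not shown in this report):

#+begin_src python
def global_sparsity(weight_tensors):
    """Fraction of entries that are exactly zero across all weight lists."""
    total = sum(len(w) for w in weight_tensors)
    zeros = sum(1 for w in weight_tensors for x in w if x == 0.0)
    return zeros / total

# Two toy "layers": 3 of the 8 weights are zero, so sparsity is 0.375.
layers = [[0.5, 0.0, -1.2, 0.0],
          [2.0, 0.0, 0.3, -0.7]]
#+end_src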

Basic pruning config.

| Epoch | Time (s) | Train Loss | Test Loss | Train Acc | Test Acc | Sparsity |
|-------+----------+------------+-----------+-----------+----------+----------|
| 1     | 110.37   | 35.547     | 25.801    | 0.69      | 0.83     | 0.0      |
| 2     | 100.53   | 24.139     | 23.167    | 0.84      | 0.81     | 0.08     |
| 3     | 103.67   | 19.789     | 22.054    | 0.87      | 0.84     | 0.26     |
| 4     | 101.6    | 16.809     | 20.582    | 0.89      | 0.87     | 0.45     |
| 5     | 103.79   | 15.053     | 20.901    | 0.91      | 0.87     | 0.64     |
| 6     | 107.55   | 14.253     | 21.799    | 0.91      | 0.87     | 0.81     |
| 7     | 112.47   | 14.276     | 24.024    | 0.92      | 0.87     | 0.94     |
| 8     | 104.8    | 11.653     | 22.526    | 0.93      | 0.84     | 0.94     |
| 9     | 100.44   | 11.072     | 24.969    | 0.93      | 0.86     | 0.94     |
| 10    | 113.35   | 11.093     | 22.774    | 0.93      | 0.85     | 0.94     |
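The schedule above zeroes an increasing share of weights as training proceeds. A common criterion for choosing which weights to drop, and a plausible reading of what the learner does (not confirmed by this report), is magnitude pruning: at each pruning event, the fraction =q= of smallest-magnitude weights is set to zero. A minimal sketch, with =q= standing in for the report's parameter of the same name:

#+begin_src python
def magnitude_prune(weights, q):
    """Zero out the fraction q of weights with the smallest magnitude."""
    k = int(q * len(weights))
    # Indices of the k smallest-magnitude entries.
    smallest = sorted(range(len(weights)), key=lambda i: abs(weights[i]))[:k]
    pruned = list(weights)
    for i in smallest:
        pruned[i] = 0.0
    return pruned
#+end_src

Repeating this every pruning step (the "frequency") yields the gradually rising Sparsity column seen above.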

Pruning with a higher frequency and an increased =q=.

| Epoch | Time (s) | Train Loss | Test Loss | Train Acc | Test Acc | Sparsity |
|-------+----------+------------+-----------+-----------+----------+----------|
| 1     | 96.19    | 34.995     | 23.946    | 0.7       | 0.85     | 0.0      |
| 2     | 91.82    | 23.645     | 20.618    | 0.85      | 0.87     | 0.12     |
| 3     | 97.14    | 18.799     | 21.472    | 0.88      | 0.87     | 0.35     |
| 4     | 87.23    | 16.199     | 21.334    | 0.9       | 0.87     | 0.58     |
| 5     | 92.77    | 14.898     | 20.033    | 0.91      | 0.86     | 0.8      |
| 6     | 89.58    | 13.04      | 22.026    | 0.92      | 0.85     | 0.94     |
| 7     | 90.19    | 13.05      | 19.617    | 0.92      | 0.87     | 0.98     |
| 8     | 94.63    | 12.57      | 21.121    | 0.93      | 0.87     | 0.98     |
| 9     | 94.02    | 12.215     | 20.467    | 0.93      | 0.88     | 0.98     |
| 10    | 94.82    | 11.609     | 36.194    | 0.93      | 0.8      | 0.98     |

Pruning with a high =ramp_mult= (20). All weights are pruned by the end of the third epoch.

| Epoch | Time (s) | Train Loss | Test Loss | Train Acc | Test Acc | Sparsity |
|-------+----------+------------+-----------+-----------+----------+----------|
| 1     | 91.27    | 35.411     | 23.284    | 0.69      | 0.84     | 0.04     |
| 2     | 96.94    | 26.752     | 29.410    | 0.79      | 0.80     | 0.74     |
| 3     | 95.56    | 36.676     | 44.136    | 0.66      | 0.50     | 1.00     |
| 4     | 87.66    | 44.078     | 43.975    | 0.50      | 0.50     | 1.00     |
| 5     | 91.71    | 44.293     | 44.319    | 0.50      | 0.50     | 1.00     |
#+end_src
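The collapse above is consistent with full sparsity: once every weight is zero (sparsity 1.00), the network can only output constants, so accuracy falls to chance (0.50). How =ramp_mult= could drive this is sketched below, under the assumption, not confirmed by this report, that it simply multiplies the per-epoch growth of the sparsity target:

#+begin_src python
def sparsity_target(epoch, base_rate, ramp_mult, final=1.0):
    """Hypothetical linear ramp: the sparsity target grows by
    base_rate * ramp_mult per epoch, capped at `final`."""
    return min(final, base_rate * ramp_mult * epoch)

# With ramp_mult = 20 even a small base rate reaches the cap of 1.0
# within three epochs, matching the run above; with ramp_mult = 1 the
# same base rate would still leave the network almost dense.
#+end_src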

* Other

Python function used to parse the learner's output into an org table.

To parse the output, paste it into this cell:

#+begin_example
---------------------------------------------------------------------------------------------------------------
| end of epoch   1 | time: 110.37s | train/valid loss 35.547/25.801 | train/valid acc 0.69/0.83 | sparsity 0.00
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch   2 | time: 100.53s | train/valid loss 24.139/23.167 | train/valid acc 0.84/0.81 | sparsity 0.08
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch   3 | time: 103.67s | train/valid loss 19.789/22.054 | train/valid acc 0.87/0.84 | sparsity 0.26
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch   4 | time: 101.60s | train/valid loss 16.809/20.582 | train/valid acc 0.89/0.87 | sparsity 0.45
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch   5 | time: 103.79s | train/valid loss 15.053/20.901 | train/valid acc 0.91/0.87 | sparsity 0.64
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch   6 | time: 107.55s | train/valid loss 14.253/21.799 | train/valid acc 0.91/0.87 | sparsity 0.81
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch   7 | time: 112.47s | train/valid loss 14.276/24.024 | train/valid acc 0.92/0.87 | sparsity 0.94
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch   8 | time: 104.80s | train/valid loss 11.653/22.526 | train/valid acc 0.93/0.84 | sparsity 0.94
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch   9 | time: 100.44s | train/valid loss 11.072/24.969 | train/valid acc 0.93/0.86 | sparsity 0.94
---------------------------------------------------------------------------------------------------------------
---------------------------------------------------------------------------------------------------------------
| end of epoch  10 | time: 113.35s | train/valid loss 11.093/22.774 | train/valid acc 0.93/0.85 | sparsity 0.94
---------------------------------------------------------------------------------------------------------------
#+end_example

After that, run the following block with =org-babel-execute-src-block=.

#+begin_src python
from parse import parse

# Format of one log line emitted by the learner.
in_fmt = '| end of epoch {:3d} | time: {:5.2f}s ' \
         '| train/valid loss {:05.3f}/{:05.3f} ' \
         '| train/valid acc {:04.3f}/{:04.3f} | sparsity {:.2f}'

# `s` holds the pasted learner output (e.g. bound via the block's
# :var header argument).  Drop the dashed separator lines and
# surrounding whitespace.
lines = [line.strip() for line in s.strip().split('\n')
         if '-' * 111 not in line]

# Emit an org table: header row, separator, then one row per epoch.
out_fmt = '| {} | {} | {} | {} | {} | {} | {} |\n'
res     = '| Epoch | Time (s) | Train Loss | Test Loss | Train Acc | Test Acc | Sparsity |\n' \
          '|-------+----------+------------+-----------+-----------+----------+----------|\n'
for line in lines:
    res += out_fmt.format(*parse(in_fmt, line))
return res
#+end_src

Paste the result into the file and press TAB to align it.
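The snippet above relies on the third-party =parse= library. If it is unavailable, the same line format can be extracted with a stdlib regex instead; a sketch over one log line taken from the output above:

#+begin_src python
import re

line = ('| end of epoch   2 | time: 100.53s | train/valid loss 24.139/23.167 '
        '| train/valid acc 0.84/0.81 | sparsity 0.08')

# Stdlib equivalent of the parse pattern: capture each numeric field.
pat = re.compile(
    r'\| end of epoch\s+(\d+) \| time:\s*([\d.]+)s '
    r'\| train/valid loss ([\d.]+)/([\d.]+) '
    r'\| train/valid acc ([\d.]+)/([\d.]+) \| sparsity ([\d.]+)')
fields = pat.match(line).groups()
row = '| {} | {} | {} | {} | {} | {} | {} |'.format(*fields)
#+end_src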