Verify the correctness of SSD loss. #8626

Closed

qingqing01 opened this issue Feb 28, 2018 · 2 comments
qingqing01 commented Feb 28, 2018

  • Target: verify that the Fluid SSD loss is consistent with Caffe.

  • Method: in Caffe, train the SSD model on the VOC dataset by following the tutorial, save the inputs and outputs of one mini-batch in Caffe's multibox_loss_layer, and then load them in the Fluid testing code.

  • The saving code in Caffe, added after line https://github.com/weiliu89/caffe/blob/ssd/src/caffe/layers/multibox_loss_layer.cpp#L151 (std::ofstream requires <fstream>):

    std::ofstream s1("loc_data", std::ios_base::binary);
    s1.write(reinterpret_cast<const char*>(loc_data), bottom[0]->count() * sizeof(Dtype));
    
    std::ofstream s2("conf_data", std::ios_base::binary);
    s2.write(reinterpret_cast<const char*>(conf_data), bottom[1]->count() * sizeof(Dtype));
    
    std::ofstream s3("prior_data", std::ios_base::binary);
    s3.write(reinterpret_cast<const char*>(prior_data), bottom[2]->count() * sizeof(Dtype));
    
    std::ofstream s4("gt_data", std::ios_base::binary);
    s4.write(reinterpret_cast<const char*>(gt_data), bottom[3]->count() * sizeof(Dtype));
    
  • Testing code in Fluid:

import paddle.v2 as paddle
import paddle.fluid as fluid
import numpy as np

def load(file_name, shape):
    with open(file_name, 'rb') as f:
        return np.fromfile(f, dtype=np.float32).reshape(shape)

def load_test_data(loc_shape, conf_shape, prior_shape, gt_shape):
    loc = load('data/loc_data', loc_shape)
    conf = load('data/conf_data', conf_shape)
    prior = load('data/prior_data', prior_shape)
    pb = prior[0, :]
    pbv = prior[1, :]
    gt = load('data/gt_data', gt_shape)
    gb = gt[:, 3:7]
    gc = gt[:, 1].reshape((-1, 1))
    giterm = gt[:, 0].astype(int).tolist()

    batch_size = max(giterm) + 1  # item index starts from 0
    lod = [0 for i in range(batch_size + 1)]
    for i in range(batch_size):
        lod[i + 1] = lod[i] + giterm.count(i)
    return loc, conf, pb, pbv, gb, gc, [lod]

def loss_check():
    p_num = 8732
    num_class = 21

    loc = fluid.layers.data(name='loc', shape=[p_num, 4], dtype='float32')
    conf = fluid.layers.data(name='conf', shape=[p_num, num_class], dtype='float32')
    prior = fluid.layers.data(name='prior', shape=[p_num, 4], dtype='float32', append_batch_size=False)
    prior_var = fluid.layers.data(name='prior_var', shape=[p_num, 4], dtype='float32', append_batch_size=False)
    gt_box = fluid.layers.data(name='gt_box', shape=[4], dtype='float32', lod_level=1)
    gt_lbl = fluid.layers.data(name='gt_lbl', shape=[1], dtype='float32', lod_level=1)

    loss = fluid.layers.ssd_loss(loc, conf, gt_box, gt_lbl, prior, prior_var)

    place = fluid.CPUPlace()
    exe = fluid.Executor(place)

    loc_shape = [-1, p_num, 4]
    conf_shape = [-1, p_num, num_class]
    prior_shape = [2, p_num, 4]
    gt_shape = [-1, 8]

    loc_t, conf_t, pb_t, pbv_t, gb_arr, glbl_arr, lod = load_test_data(loc_shape, conf_shape, prior_shape, gt_shape)

    gb_t = fluid.core.LoDTensor()
    gb_t.set(gb_arr, place)
    gb_t.set_lod(lod)

    gc_t = fluid.core.LoDTensor()
    gc_t.set(glbl_arr, place)
    gc_t.set_lod(lod)

    feeding = {'loc': loc_t, 'conf': conf_t, 'prior': pb_t,
                'prior_var': pbv_t, 'gt_box': gb_t, 'gt_lbl': gc_t}
    #print fluid.default_main_program()
    loss_v, = exe.run(fluid.default_main_program(), feed=feeding, fetch_list=[loss])
    print loss_v
    print loss_v.shape
    print np.sum(loss_v)


if __name__ == '__main__':
    loss_check()
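
For reference, here is a small worked example of the LoD construction in load_test_data above (hypothetical data; it assumes the usual Caffe gt_data layout, where column 0 holds the image index of each ground-truth box):

# Hypothetical example: 3 boxes belong to image 0, 1 box to image 1, 2 boxes to image 2.
giterm = [0, 0, 0, 1, 2, 2]
batch_size = max(giterm) + 1
lod = [0] * (batch_size + 1)
for i in range(batch_size):
    lod[i + 1] = lod[i] + giterm.count(i)
print(lod)  # [0, 3, 4, 6] -> image 0 owns rows 0-2, image 1 owns row 3, image 2 owns rows 4-5
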
  • Results of forward loss:
    • loss of one mini-batch data: Fluid 23.2972 vs Caffe 23.297
    • loss of another mini-batch data: Fluid 23.7343 vs Caffe 23.7343

So the forward loss is correct.
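
A minimal programmatic version of this check (a sketch only; caffe_loss is the value logged by Caffe for the same mini-batch, and loss_v comes from the run above) could be appended to loss_check():

    caffe_loss = 23.297  # reference value printed by Caffe for this mini-batch
    fluid_loss = float(np.sum(loss_v))
    assert abs(fluid_loss - caffe_loss) < 1e-3, (fluid_loss, caffe_loss)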


qingqing01 commented Mar 1, 2018

Verify confidence gradients

The confidence gradients for one sample:

  • Fluid
[[-0.1992908   0.00825488  0.01199047  0.02423528  0.00792938  0.01677721
   0.0028048   0.01369748  0.00847711  0.01919442  0.00641916  0.01908394
   0.01902221  0.00596175  0.00827814  0.00810263  0.00686489  0.00445934
   0.00225886  0.00388005  0.00159883]
 [-0.19936702  0.00756692  0.01117872  0.02741443  0.00587094  0.01449234
   0.0024837   0.01470402  0.00686268  0.02048922  0.00632338  0.01824134
   0.02078596  0.00526546  0.00974535  0.00872548  0.00725919  0.0030125
   0.00265777  0.00438818  0.00189944]
 [-0.199402    0.00690327  0.0095233   0.02869447  0.00505326  0.01337513
   0.0018478   0.01715952  0.00636242  0.01928199  0.00655488  0.01366473
   0.0244522   0.00491364  0.01110354  0.01029228  0.00790294  0.00273219
   0.0030584   0.004533    0.00199303]
 [-0.19943415  0.00610243  0.00767753  0.02684752  0.00506072  0.01241578
   0.00155259  0.02009527  0.00727706  0.01799602  0.00766025  0.01125705
   0.02763908  0.00424804  0.01112364  0.01092169  0.00824735  0.00282025
   0.00338689  0.00482514  0.00227987]
 [-0.19939244  0.00525302  0.00751871  0.02435748  0.00544498  0.01207561
   0.00143729  0.02028093  0.00809389  0.01757362  0.00805904  0.01132297
   0.03260975  0.00409669  0.00959663  0.00962749  0.00858057  0.00290539
   0.00318636  0.00483612  0.0025359 ]
 [-0.19933611  0.00516152  0.00762941  0.02302878  0.00596994  0.0122163
   0.00146134  0.01935527  0.00794396  0.01777591  0.00818626  0.01177212
   0.03660865  0.00401642  0.00868012  0.00810076  0.00837608  0.00288199
   0.00287898  0.00450199  0.00279032]
 [-0.19929935  0.00519787  0.00784324  0.0230044   0.00622441  0.01245881
   0.00146952  0.01891684  0.00765285  0.0175505   0.00809368  0.0124937
   0.03777711  0.00407055  0.00830298  0.00737403  0.00782559  0.00286741
   0.00270789  0.00443341  0.00303452]
 [-0.19929564  0.00532724  0.00795782  0.02347768  0.0061983   0.01265476
   0.00147549  0.01912792  0.00729363  0.0173388   0.00801109  0.0128324
   0.03754841  0.00410715  0.00837757  0.00708322  0.00726826  0.00289627
   0.00264057  0.00450241  0.00317668]
 [-0.19927242  0.00540276  0.00856238  0.02413696  0.00600794  0.01217696
   0.0014877   0.01951584  0.00735367  0.01729929  0.00789836  0.01225255
   0.03620198  0.00414334  0.00900283  0.00716     0.0070185   0.00287078
   0.00272461  0.0047539   0.00330208]
 [-0.19923526  0.00544352  0.00927655  0.02582101  0.00585211  0.01084526
   0.00151108  0.01891122  0.0075052   0.01759712  0.00804772  0.0112243
   0.03415913  0.00429092  0.00960609  0.00747203  0.0069708   0.00287795
   0.00308282  0.0051694   0.00357103]
 [-0.19919473  0.00541169  0.00974631  0.0265266   0.00594862  0.00942236
   0.00155741  0.01683494  0.00746294  0.01742782  0.00841299  0.01055974
   0.03485496  0.0046041   0.0098611   0.00741054  0.00712574  0.00287287
   0.00353691  0.00574258  0.00387457]
 [-0.19928625  0.00358208  0.00160952  0.05565556  0.01505674  0.01315142
   0.01418038  0.00511527  0.00668512  0.009781    0.00947566  0.00180506
   0.00458663  0.00366373  0.0036167   0.00339534  0.02373105  0.00408939
   0.0042305   0.00178832  0.01408677]
 [ 0.00416961  0.02556739  0.01101466  0.00236778  0.03420679  0.01345179
   0.0026261   0.01159472  0.0148463   0.00360806  0.00664276  0.02344195
   0.00845314  0.00245682  0.00500347 -0.19275236  0.00248153  0.00842555
   0.00382931  0.00631989  0.00224477]
 [ 0.00431013  0.02052785  0.01043427  0.0083735   0.01533755  0.00684404
   0.02301331  0.00962037  0.01041971  0.00348646  0.0051127   0.01200862
   0.00102795  0.00844576  0.00440515 -0.19062281  0.00402603  0.01113473
   0.00540809  0.02294033  0.00374628]
 [ 0.00533081  0.01016065  0.00583632  0.00048624  0.01010419  0.01106227
   0.00585401  0.0111981   0.00231263  0.01981713  0.00371024  0.00488221
   0.01497615  0.02408479  0.00878555 -0.19231762  0.01852817  0.0205886
   0.0042982   0.00455432  0.00574703]
 [ 0.01540889  0.00444972  0.03487318  0.01419673  0.00847425  0.00649205
   0.00986262  0.00244522  0.01663786  0.00217686  0.00602963  0.00551804
   0.01144056  0.00531416  0.00432371 -0.18628879  0.00158936  0.01096136
   0.01112011  0.01064433  0.00433014]
 [ 0.01047893  0.00744325 -0.19561918  0.01051297  0.01198426  0.03411746
   0.01252159  0.00415636  0.00778598  0.00660572  0.02644507  0.00354053
   0.01215073  0.00223955  0.00105366  0.01705031  0.00573522  0.00795422
   0.00607285  0.00279486  0.00497567]
 [-0.19933657  0.00361837  0.01242569  0.01997736  0.00815224  0.01651757
   0.01378537  0.00365795  0.00452557  0.00688551  0.01378152  0.0050786
   0.00307833  0.01135987  0.00830146  0.00439357  0.00194808  0.04724045
   0.00947554  0.00377414  0.00135941]
 [-0.19935669  0.00230722  0.00132983  0.01541838  0.03718892  0.00631958
   0.01397749  0.02670378  0.0036823   0.01225351  0.00263461  0.01171905
   0.01448554  0.00894655  0.00105401  0.00352136  0.02572424  0.00106055
   0.00418605  0.00423709  0.00260659]
 [-0.19929852  0.00686423  0.00146887  0.01969476  0.0762256   0.00248293
   0.0154927   0.01709834  0.00830558  0.00593895  0.0034443   0.00286663
   0.00692003  0.00518335  0.00117128  0.00449472  0.00449408  0.00029757
   0.01287119  0.0022777   0.0017057 ]]

  • Caffe:
[[-0.1992908   0.00825488  0.01199047  0.02423528  0.00792938  0.0167772
   0.0028048   0.01369748  0.00847711  0.01919442  0.00641916  0.01908395
   0.01902221  0.00596175  0.00827814  0.00810263  0.00686489  0.00445934
   0.00225886  0.00388005  0.00159883]
 [-0.19936702  0.00756692  0.01117872  0.02741443  0.00587094  0.01449234
   0.0024837   0.01470402  0.00686268  0.02048922  0.00632338  0.01824134
   0.02078596  0.00526546  0.00974535  0.00872548  0.00725919  0.0030125
   0.00265777  0.00438818  0.00189944]
 [-0.199402    0.00690327  0.0095233   0.02869447  0.00505326  0.01337512
   0.0018478   0.01715952  0.00636242  0.01928199  0.00655488  0.01366473
   0.0244522   0.00491364  0.01110354  0.01029228  0.00790294  0.00273219
   0.0030584   0.004533    0.00199303]
 [-0.19943415  0.00610243  0.00767752  0.02684752  0.00506072  0.01241578
   0.00155259  0.02009527  0.00727706  0.01799602  0.00766025  0.01125705
   0.02763908  0.00424804  0.01112364  0.01092168  0.00824735  0.00282025
   0.00338689  0.00482514  0.00227987]
 [-0.19939244  0.00525302  0.00751871  0.02435748  0.00544498  0.01207561
   0.00143729  0.02028093  0.00809389  0.01757362  0.00805904  0.01132297
   0.03260975  0.00409669  0.00959663  0.00962749  0.00858057  0.00290539
   0.00318636  0.00483612  0.0025359 ]
 [-0.19933611  0.00516152  0.00762941  0.02302878  0.00596994  0.0122163
   0.00146134  0.01935527  0.00794396  0.01777591  0.00818626  0.01177212
   0.03660865  0.00401642  0.00868012  0.00810076  0.00837608  0.00288199
   0.00287898  0.00450199  0.00279032]
 [-0.19929934  0.00519787  0.00784324  0.0230044   0.00622441  0.01245881
   0.00146952  0.01891684  0.00765285  0.0175505   0.00809368  0.0124937
   0.03777711  0.00407055  0.00830298  0.00737403  0.0078256   0.00286741
   0.00270789  0.00443341  0.00303452]
 [-0.19929563  0.00532724  0.00795782  0.02347768  0.0061983   0.01265476
   0.00147549  0.01912792  0.00729363  0.0173388   0.00801109  0.01283239
   0.03754841  0.00410715  0.00837757  0.00708322  0.00726826  0.00289627
   0.00264057  0.00450241  0.00317668]
 [-0.19927242  0.00540275  0.00856238  0.02413696  0.00600794  0.01217696
   0.0014877   0.01951584  0.00735367  0.01729929  0.00789836  0.01225255
   0.03620198  0.00414334  0.00900283  0.00716     0.0070185   0.00287078
   0.00272461  0.0047539   0.00330208]
 [-0.19923526  0.00544352  0.00927655  0.02582101  0.00585211  0.01084526
   0.00151108  0.01891122  0.0075052   0.01759712  0.00804772  0.0112243
   0.03415914  0.00429092  0.00960609  0.00747204  0.0069708   0.00287795
   0.00308282  0.0051694   0.00357103]
 [-0.19919473  0.00541169  0.00974631  0.02652659  0.00594862  0.00942236
   0.00155741  0.01683494  0.00746294  0.01742782  0.00841299  0.01055974
   0.03485496  0.0046041   0.0098611   0.00741054  0.00712574  0.00287287
   0.00353691  0.00574258  0.00387457]
 [-0.19928625  0.00358208  0.00160952  0.05565556  0.01505675  0.01315142
   0.01418039  0.00511527  0.00668512  0.009781    0.00947566  0.00180506
   0.00458663  0.00366373  0.0036167   0.00339534  0.02373105  0.00408939
   0.0042305   0.00178832  0.01408677]
 [ 0.00416961  0.02556739  0.01101466  0.00236778  0.03420679  0.01345179
   0.0026261   0.01159472  0.0148463   0.00360806  0.00664276  0.02344195
   0.00845314  0.00245682  0.00500347 -0.19275236  0.00248153  0.00842555
   0.00382931  0.00631989  0.00224477]
 [ 0.00431013  0.02052785  0.01043427  0.0083735   0.01533754  0.00684404
   0.02301331  0.00962037  0.01041971  0.00348646  0.0051127   0.01200862
   0.00102795  0.00844576  0.00440515 -0.19062281  0.00402603  0.01113473
   0.00540809  0.02294033  0.00374628]
 [ 0.00533081  0.01016065  0.00583632  0.00048624  0.01010419  0.01106227
   0.00585401  0.0111981   0.00231263  0.01981713  0.00371024  0.00488221
   0.01497615  0.02408479  0.00878555 -0.19231762  0.01852817  0.02058859
   0.0042982   0.00455432  0.00574703]
 [ 0.01540889  0.00444972  0.03487318  0.01419673  0.00847425  0.00649205
   0.00986262  0.00244522  0.01663786  0.00217686  0.00602963  0.00551804
   0.01144056  0.00531416  0.00432371 -0.18628879  0.00158936  0.01096136
   0.01112011  0.01064433  0.00433014]
 [ 0.01047893  0.00744325 -0.19561918  0.01051297  0.01198425  0.03411746
   0.01252159  0.00415636  0.00778598  0.00660572  0.02644507  0.00354053
   0.01215073  0.00223955  0.00105366  0.01705031  0.00573522  0.00795422
   0.00607285  0.00279486  0.00497567]
 [-0.19933657  0.00361837  0.01242569  0.01997735  0.00815224  0.01651757
   0.01378537  0.00365795  0.00452557  0.0068855   0.01378151  0.0050786
   0.00307833  0.01135987  0.00830146  0.00439357  0.00194808  0.04724045
   0.00947554  0.00377414  0.00135941]
 [-0.19935669  0.00230723  0.00132983  0.01541838  0.03718893  0.00631958
   0.01397749  0.02670378  0.0036823   0.01225351  0.00263461  0.01171906
   0.01448554  0.00894655  0.00105401  0.00352136  0.02572425  0.00106055
   0.00418605  0.00423709  0.00260659]
 [-0.1992985   0.00686423  0.00146887  0.01969476  0.0762256   0.00248293
   0.0154927   0.01709834  0.00830558  0.00593895  0.0034443   0.00286663
   0.00692003  0.00518335  0.00117128  0.00449472  0.00449408  0.00029757
   0.01287119  0.0022777   0.0017057 ]]
  • Conclusion:

The absolute error is about 0.00000x%, so there is no problem with the confidence gradients.

  • The main Fluid code:
    # omit the other code
    loc_grad_var = fluid.framework.get_var('reshape_6.tmp_0@GRAD')
    conf_grad_var = fluid.framework.get_var('reshape_1.tmp_0@GRAD')
    conf_loss_grad_var = fluid.framework.get_var('softmax_with_cross_entropy_1.tmp_1@GRAD')
    fetch = [loss, loc_grad_var, conf_grad_var, conf_loss_grad_var]
    loss_v, loc_grad_v, conf_grad_v, conf_loss_grad_v = exe.run(fluid.default_main_program(), feed=feeding, fetch_list=fetch)

    conf_loss_grad_v = np.array(conf_loss_grad_v).astype(np.float32).flatten()
    indices = np.where(conf_loss_grad_v > 0.)[0]

    print('-------- fluid gradients ------')
    print 'conf gradients', conf_grad_v[indices, :]

    print('-------- caffe gradient ------')
    expect_conf_g = load('data/conf_diff_data', conf_grad_v.shape)
    print 'conf gradient', expect_conf_g[indices, :]
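
To turn the visual comparison above into a single number, a small helper along these lines could be used (a sketch only; grad_diff is a hypothetical name, and conf_grad_v, expect_conf_g, indices come from the snippet above):

    def grad_diff(actual, expected):
        # max absolute difference and max relative difference (in percent)
        abs_diff = np.abs(actual - expected)
        rel_diff = abs_diff / (np.abs(expected) + 1e-12)
        return np.max(abs_diff), np.max(rel_diff) * 100.0

    max_abs, max_rel = grad_diff(conf_grad_v[indices, :], expect_conf_g[indices, :])
    print('conf grad: max abs diff %g, max rel diff %g%%' % (max_abs, max_rel))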

qingqing01 commented

Verify localization gradients

The localization gradients for one sample:

  • Fluid:
[[-0.2        -0.2         0.2         0.2       ]
 [-0.2        -0.2         0.2         0.2       ]
 [-0.03149858 -0.2         0.2         0.10940187]
 [-0.2        -0.06359603  0.2         0.2       ]
 [-0.2         0.2         0.2         0.2       ]]
  • Caffe:
[[-0.2        -0.2         0.2         0.2       ]
 [-0.2        -0.2         0.2         0.2       ]
 [-0.03149858 -0.2         0.2         0.10940187]
 [-0.2        -0.06359603  0.2         0.2       ]
 [-0.2         0.2         0.2         0.2       ]]
  • The main Fluid code:
    loc_grad_var = fluid.framework.get_var('reshape_6.tmp_0@GRAD')
    conf_grad_var = fluid.framework.get_var('reshape_1.tmp_0@GRAD')
    conf_loss_grad_var = fluid.framework.get_var('softmax_with_cross_entropy_1.tmp_1@GRAD')
    loc_loss_grad_var = fluid.framework.get_var('smooth_l1_loss_0.tmp_1@GRAD')

    fetch = [loss, loc_grad_var, conf_grad_var, conf_loss_grad_var, loc_loss_grad_var]
    loss_v, loc_grad_v, conf_grad_v, conf_loss_grad_v, loc_loss_grad_v = exe.run(fluid.default_main_program(), feed=feeding, fetch_list=fetch)

    loc_loss_grad_v = np.array(loc_loss_grad_v).astype(np.float32).flatten()
    loc_indices = np.where(loc_loss_grad_v > 0.)[0]
    print loc_indices

    print('-------- fluid gradients ------')
    print 'loc gradients sum', np.sum(loc_grad_v[loc_indices, :])
    f1 = open("fluid_loc.txt", "w")
    print >> f1, 'loc gradients', loc_grad_v[loc_indices, :]
    f1.close()

    print('-------- caffe gradient ------')
    expect_loc_g = load('data/loc_diff_data', loc_grad_v.shape)
    print 'loc gradient sum', np.sum(expect_loc_g[loc_indices, :])
    f2 = open("caffe_loc.txt", "w")
    print >> f2, 'loc gradient', expect_loc_g[loc_indices, :]
    f2.close()
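
The same hypothetical grad_diff helper used for the confidence gradients can quantify the match here as well:

    max_abs, max_rel = grad_diff(loc_grad_v[loc_indices, :], expect_loc_g[loc_indices, :])
    print('loc grad: max abs diff %g, max rel diff %g%%' % (max_abs, max_rel))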
