
KeyError occurred when running HPatches scripts #17

Closed
gsygsy96 opened this issue Jun 4, 2018 · 10 comments

Comments

@gsygsy96

gsygsy96 commented Jun 4, 2018

When I run train_hardnet_on_HPatches_per_split.sh, a KeyError occurs. It seems that something is wrong in the data generation?

@ducha-aiki
Collaborator

Could you please dump the exact error message here?

@gsygsy96
Author

gsygsy96 commented Jun 4, 2018

```
./code/HardNetHPatchesSplits.py:426: UserWarning: nn.init.orthogonal is now deprecated in favor of nn.init.orthogonal_.
  nn.init.orthogonal(m.weight.data, gain=0.6)
['/media/xxx/DATA/guan/hardnet-master/code/../data/sets/hpatches_splits/hpatches_split_a_train.pt']
Generating 15000000 triplets
  0%|          | 0/15000000 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "./code/HardNetHPatchesSplits.py", line 708, in <module>
    train_loader, test_loaders = create_loaders(load_random_triplets=triplet_flag)
  File "./code/HardNetHPatchesSplits.py", line 469, in create_loaders
    transform=transform_train),
  File "./code/HardNetHPatchesSplits.py", line 208, in __init__
    self.triplets = self.generate_triplets(self.labels, self.n_triplets, self.batch_size)
  File "./code/HardNetHPatchesSplits.py", line 237, in generate_triplets
    if len(indices[c1]) == 2:  # hack to speed up process
KeyError: 68268
```

Seems like an error in the dict?
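For what it's worth, a KeyError at `indices[c1]` usually means the class id being sampled does not exist as a key in the `indices` dict. A minimal sketch (illustrative only, not the actual HardNet code; `labels`, `indices`, and `c1` here are stand-ins) of how non-contiguous label ids can trigger exactly this:

```python
import random

# Hypothetical reconstruction: `indices` maps each label id to the list
# of patch indices with that label, but the class id c1 is drawn from
# range(n_classes). Any gap in the label ids then raises a KeyError.
labels = [0, 1, 3, 3]          # label id 2 is missing (non-contiguous ids)
indices = {}
for idx, label in enumerate(labels):
    indices.setdefault(label, []).append(idx)

n_classes = max(labels) + 1    # 4, but only 3 keys exist in `indices`
c1 = 2                         # an unlucky draw, like 68268 in the traceback
try:
    _ = indices[c1]            # raises KeyError: 2
except KeyError:
    print("KeyError: label id", c1, "has no patches")

# A safe variant: draw class ids only from keys that actually exist.
c1 = random.choice(list(indices.keys()))
assert c1 in indices
```

If the dataset `.pt` file was generated incompletely, some label ids could be missing in exactly this way.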

@gsygsy96
Author

gsygsy96 commented Jun 4, 2018

And could you tell me:
1. what does args.batch_reduce mean? L2Net/random_global?
2. what are args.decor and args.gor?

I know args.decor and args.gor are for L2Net, but what does batch_reduce mean? And anchor_swap in loss_HardNet? By the way, the learning rate in the code is really high...

@DagnyT
Owner

DagnyT commented Jun 4, 2018

Hi!
About HardNetHPatchesSplits: can you please share more details on how you generated the HPatches dataset files? Did you use HPatchesDatasetCreator, and did it finish successfully?

About your questions: batch_reduce is the variable that selects the sampling strategy. The default setup is 'min', which means that for each anchor we take the hardest (smallest-distance) negative across the batch. 'random' means that we take a random sample across the batch as the negative.
'random_global' means that we use pre-generated anchors, positives and negatives that go into the loss, drawn per dataset rather than per batch.
anchor_swap: for each anchor we choose the hardest negative; for its positive we also choose the hardest negative; and between the two we keep the smaller negative distance.
args.decor is a penalty on the correlation between descriptor dimensions; it refers to the CorrelationPenaltyLoss class.
args.gor enables global orthogonal regularization; for more details you can check https://arxiv.org/pdf/1708.06320.pdf
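The 'min' strategy with anchor_swap can be sketched roughly like this (a minimal illustration assuming L2-normalized descriptors; the function name and signature are mine, not the repo's actual loss_HardNet API):

```python
import torch

def hardest_negative_loss(anchor, positive, margin=1.0, anchor_swap=False):
    """Sketch of 'min' batch_reduce with optional anchor_swap.

    anchor, positive: (B, D) L2-normalized descriptors, where row i of
    `positive` is the matching patch for row i of `anchor`.
    """
    dist = torch.cdist(anchor, positive)            # (B, B) pairwise distances
    pos = dist.diag()                               # distances of matching pairs
    # Mask the diagonal so a positive is never picked as its own negative.
    eye = torch.eye(dist.size(0), dtype=torch.bool)
    d = dist.masked_fill(eye, float('inf'))
    min_neg = d.min(dim=1).values                   # hardest negative per anchor
    if anchor_swap:
        # Also mine the hardest negative for each positive (column-wise min)
        # and keep the smaller of the two negative distances.
        min_neg = torch.min(min_neg, d.min(dim=0).values)
    return torch.clamp(margin + pos - min_neg, min=0).mean()
```

Since anchor_swap can only shrink the mined negative distance, the swapped loss is never smaller than the plain 'min' loss on the same batch.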

@gsygsy96
Author

gsygsy96 commented Jun 5, 2018

First, thanks for your help! That cleared up a lot!
To confirm I understood you, I will restate your answer in my own words. Could you review my understanding?

  1. batch_reduce selects among different ways of sampling (anchor, negative) pairs:
    -- min: choose the closest sample in the batch as the anchor's negative
    -- random: choose a random negative
    -- random_global: choose pre-generated pairs from the whole dataset
  2. anchor_swap: Sorry, I don't understand what you mean. Do you mean choosing the smallest negative for the anchor? If so, what is the difference between anchor_swap and 'min'?

@gsygsy96
Author

gsygsy96 commented Jun 5, 2018

And for HPatches, I ran HPatchesDatasetCreator correctly. How much RAM did you use when running HPatches?

@ducha-aiki
Collaborator

@shanYanGuan the anchor swap procedure is described in detail in this paper: http://www.bmva.org/bmvc/2016/papers/paper119/paper119.pdf

@gsygsy96
Author

gsygsy96 commented Jun 5, 2018

Thanks! And how much RAM does your computer have?

@ducha-aiki
Collaborator

16 GB

@saisusmitha

Hi, can you tell me the sequence in which to run the scripts, as there are many? I also got the same warning:

```
./code/HardNetHPatchesSplits.py:426: UserWarning: nn.init.orthogonal is now deprecated in favor of nn.init.orthogonal_.
  nn.init.orthogonal(m.weight.data, gain=0.6)
```

Could you tell me how to fix it?
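Note that this line is only a deprecation warning, not the KeyError crash itself. On recent PyTorch versions the in-place initializer is spelled with a trailing underscore. A minimal sketch of the change (the Conv2d layer here is illustrative, not the actual layer from the repo):

```python
import torch.nn as nn

conv = nn.Conv2d(1, 32, kernel_size=3, bias=False)
# Old spelling (deprecated, emits the UserWarning seen above):
# nn.init.orthogonal(conv.weight.data, gain=0.6)
# Current spelling -- the in-place variant with a trailing underscore:
nn.init.orthogonal_(conv.weight.data, gain=0.6)
```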
