
How to train on my own datasets? #3

Closed
happsky opened this issue Jul 4, 2018 · 2 comments

happsky commented Jul 4, 2018

How to train on my own datasets?

aurooj (Owner) commented Jul 20, 2018

@happsky: Sorry for the late response to your query. If you want to train on your own dataset, you can follow the standard training procedure described in the GitHub repo of RefineNet.
You will need to create files gen_class_info_<dataset-name>.m and my_gen_ds_info_<dataset-name>.m with information about your dataset and its train, val, and test splits.
If your dataset also has two classes, hand and background, then you can use the same files from the refinenet_files folder in this repo.
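For a two-class hand/background setup, the class-info file could look roughly like the sketch below. This is only a sketch, not the repo's exact code: the function name gen_class_info_hand and the field names used here are assumptions, so mirror whatever gen_class_info_voc.m in the RefineNet repo actually sets.

```matlab
% gen_class_info_hand.m -- hypothetical example; the field names are assumptions,
% copy the exact structure from gen_class_info_voc.m in the RefineNet repo.
function class_info = gen_class_info_hand()

class_info = [];
class_info.class_names = {'background', 'hand'};  % label 0 = background, label 1 = hand
class_info.class_num   = numel(class_info.class_names);
class_info.void_label  = 255;  % label value to ignore in the loss, if RefineNet uses one

end
```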

I hope it helps. Let me know if you have any more queries.

aurooj (Owner) commented Jul 21, 2018

To be clearer, this is what you need to do:

  • Clone the RefineNet GitHub repo.

  • If you want to train on your own dataset, you need to follow the demo training file: demo_refinenet_train.m.

  • You need to provide the information for your dataset and its classes. If you go through demo_refinenet_train.m, you will find that the dataset information is defined in my_gen_ds_info_voc.m, so follow that file to write your own dataset-definition file, e.g., my_gen_ds_info_<dataset-name>.m (see the sketch after this list).

  • There's a line in the file my_gen_ds_info_voc.m:

ds_info.class_info=gen_class_info_voc();

where the class information is specified by the function gen_class_info_voc(); follow it to write your own function (as in the hand/background sketch in the previous comment).

  • Since you will need a new classification layer, you need to skip the initialization of the last classification layer from the trained model.

To skip this initialization, open gen_network_main.m. You will find this line in the file:

loss_group_info=gen_network_loss_group(train_opts, net_config, group_output_info);

Add the following lines after the line above to stop the initialization of the classification layer:


```matlab
% if you want to increase the learning rate for the last classification layer, uncomment this line:
% loss_group_info.net_info.ref.lr_multiplier=10;
```
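For reference, a dataset-info file for a hand dataset might look roughly like the sketch below. This is only a sketch: the function name my_gen_ds_info_hand, the directory layout, and every field name other than ds_info.class_info are illustrative assumptions, so copy the exact field names from my_gen_ds_info_voc.m in the RefineNet repo.

```matlab
% my_gen_ds_info_hand.m -- hypothetical sketch; mirror the exact fields used
% in my_gen_ds_info_voc.m, the names below are assumptions.
function ds_info = my_gen_ds_info_hand()

ds_info = [];
ds_info.ds_name = 'hand_dataset';
ds_info.ds_dir  = './datasets/hand_dataset';             % root folder of your dataset

% image and ground-truth mask locations (adjust to your layout):
ds_info.img_dir  = fullfile(ds_info.ds_dir, 'images');
ds_info.mask_dir = fullfile(ds_info.ds_dir, 'masks');

% train/val/test splits, e.g. plain text files listing one image name per line:
ds_info.train_list = importdata(fullfile(ds_info.ds_dir, 'train.txt'));
ds_info.val_list   = importdata(fullfile(ds_info.ds_dir, 'val.txt'));
ds_info.test_list  = importdata(fullfile(ds_info.ds_dir, 'test.txt'));

% hook in the class information, analogous to ds_info.class_info=gen_class_info_voc();
ds_info.class_info = gen_class_info_hand();

end
```

Once both files exist, point demo_refinenet_train.m at your dataset-info function instead of my_gen_ds_info_voc() and train as in the demo.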
