Hi, thank you very much for your great work!
What should I do if I want to train on the COCO dataset? The MegaDepth dataset is simply too big for me.
My plan is to first use SuperPoint to export detections for the COCO images as a series of .npz files, and then run `python3 -m dump.dump_megadepth --feature_type spp --base_path path_of_megadepth --save_path your_save_path`. Is this correct?
Looking forward to your reply!
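For the first step described above (caching detections as .npz files), a minimal sketch could look like the following. Note that `detect_fn`, its return signature, and the per-image file naming are assumptions for illustration, not the repo's actual export script:

```python
import numpy as np
from pathlib import Path


def export_detections(image_paths, detect_fn, save_dir):
    """Run a detector (e.g. SuperPoint) on each image and cache the results.

    detect_fn is a hypothetical callable assumed to return
    (keypoints Nx2, scores N, descriptors NxD) for an image path.
    One .npz file is written per image, named after the image stem.
    """
    save_dir = Path(save_dir)
    save_dir.mkdir(parents=True, exist_ok=True)
    for p in image_paths:
        kpts, scores, descs = detect_fn(p)
        out = save_dir / (Path(p).stem + ".npz")
        # np.savez stores the arrays under the given keyword names
        np.savez(out, keypoints=kpts, scores=scores, descriptors=descs)
```

A downstream loader would then read each file with `np.load` and access the arrays by key (`data["keypoints"]`, etc.).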
For training on the COCO dataset, a better approach is to generate paired images online using OpenCV's perspective projection functions. I have tried training on COCO alone, but the performance was not very good.
I will try to provide the training scripts for COCO.
Thank you very much for your reply.
Since my own training dataset is similar to COCO, I would need your COCO training code. Looking forward to your training scripts!