Describe the feature
I have successfully trained a Mask R-CNN network using MMDetection on a relatively small custom dataset. I was wondering whether it is possible to use a very large custom dataset without having to write the annotations into a single JSON file, since JSON in general isn't made for handling files that large.
Motivation
Ex1. It is inconvenient to serialize a large number of custom annotations to JSON. I often get out-of-memory errors when trying to serialize these annotations to a single JSON file (COCO format).
Ex2. JSON is not designed for such large datasets, so the resulting annotation files can take up a lot of space.
Thanks for your kind suggestion. For now, we use JSON for convenience. If you want to use another format, you may consider re-implementing the dataset loading logic. We will consider the efficiency issue in the future.
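One way to re-implement that loading logic without a monolithic JSON file is to store one annotation record per line (JSON Lines), so records can be written and read back as a stream instead of being held in memory all at once. The sketch below is only an illustration of that idea, not an MMDetection API; the record fields (`filename`, `ann`, etc.) are assumptions modeled loosely on the middle-format annotation dicts, and you would still need to wire a loader like this into your own dataset class.

```python
import json


def write_jsonl(records, path):
    """Write one annotation record per line (JSON Lines).

    Unlike dumping one giant JSON array, this never needs the whole
    serialized dataset in memory at once.
    """
    with open(path, "w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")


def read_jsonl(path):
    """Yield annotation records one at a time from a JSON Lines file.

    Each line is parsed independently, so memory use stays bounded
    by the size of a single record, not the whole file.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)
```

For example, `write_jsonl([{"filename": "img_0.jpg", "ann": {"labels": [1]}}], "ann.jsonl")` produces a file you can later iterate with `read_jsonl("ann.jsonl")` without ever loading all annotations at once.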