Hi @elnino9ykl,
thank you for sharing the code. I find it very valuable.
I am trying to run the script main.py under train to train the model.
However, I don't know how to prepare the dataset folder for the datadir argument.
From reading the code and the paper, I guess that you use a subset of 27 of the 66 classes in the Mapillary Vistas dataset.
However, the code only supports the Cityscapes dataset, as implemented in the cityscapes class in piwise/dataset.py.
Could you please show me the proper way to prepare the dataset folder for the datadir argument?
If possible, could you also share the script that converts Mapillary Vistas to your desired format (folder structure, selecting only the 27 classes, ...)?
Thank you.
Hi,
thank you very much for your interest in our work. We use the 27 classes shown in Table I of the paper. I preprocessed the Vistas dataset to fit the Cityscapes format; you can implement this easily yourself. Another option is to use a different segmentation training framework, such as mmsegmentation or torchseg, and integrate our model and our model adaptation method there.
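In case it helps, the remapping step can be sketched roughly as below. Note that the actual Vistas-ID-to-train-ID correspondence must be taken from Table I of the paper; the entries in the dictionary here are placeholders for illustration, not the real mapping.

```python
import numpy as np

# Hypothetical mapping from Mapillary Vistas label IDs to the 27 train IDs.
# The real correspondence is given in Table I of the paper -- these entries
# are illustrative placeholders only.
VISTAS_TO_TRAIN_ID = {
    13: 0,   # e.g. road (placeholder IDs)
    24: 1,   # e.g. sidewalk
    17: 2,   # e.g. building
    # ... fill in the remaining classes per Table I
}

IGNORE_ID = 255  # classes outside the 27-class subset are ignored


def remap_labels(label_map: np.ndarray) -> np.ndarray:
    """Remap a Vistas label image (H x W array of class IDs) to train IDs.

    Any Vistas class not in the 27-class subset is mapped to IGNORE_ID,
    following the usual Cityscapes ignore-label convention.
    """
    lut = np.full(256, IGNORE_ID, dtype=np.uint8)
    for vistas_id, train_id in VISTAS_TO_TRAIN_ID.items():
        lut[vistas_id] = train_id
    return lut[label_map]
```

After remapping each annotation image this way, you would save the images and labels under a Cityscapes-style folder layout (e.g. leftImg8bit/train/... and gtFine/train/...) so that the existing cityscapes dataset class can read them without modification.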