
Problem downloading the DADA-2000 dataset #1

Open
KC900201 opened this issue Mar 11, 2020 · 15 comments
KC900201 commented Mar 11, 2020

Hi, I wish to download the public traffic dataset from this GitHub repository. However, the dataset is stored in a Baidu repository, shown below, and Baidu currently doesn't allow member registration outside of PR China. Is there any other way for me to download the dataset from another source?

https://pan.baidu.com/s/1gt0zzd-ofeVeElSlTQbVmw#list/path=%2FDADA-2000%2FHalf%20of%20the%20data&parentPath=%2F


KC900201 commented Jun 1, 2020

Hi @JWFangit , any news on the issue?

@YanDingXin (Collaborator)

> Hi @JWFangit, any news on the issue?

Hello, we are glad to get your attention. The ground truth of our dataset has been revised. We're working on it; the dataset will be uploaded again in the near future, and will also be uploaded to Google Drive. Please be patient. Thank you.

@deepakgopinath

Hi, @JWFangit

We were wondering if there is any update on this! We were hoping to download the DADA-2000 dataset and are facing similar issues to @KC900201.

Is the dataset now available on Google Drive?

Thanks
D


San-Di commented Oct 7, 2021

First of all, thank you very much for this good dataset for detecting traffic anomalies together with anomaly classification.
However, the dataset is uploaded to a Baidu repository, and Baidu currently doesn't allow member registration outside of PR China, as @KC900201 mentioned above.
Would you please upload the raw videos to Google Drive?

Thank you


KC900201 commented Oct 7, 2021

@San-Di Is there any link provided for the raw videos?

@KC900201 (Author)

@San-Di Can you provide the link to the Google Drive?


San-Di commented Oct 14, 2021

@KC900201 I was just asking the author to provide the raw videos or a link other than the Baidu platform. I don't have the raw videos either. ^^

@KC900201 (Author)

@JWFangit Any updates on this issue?

@Hou-XinTong

Hi! I have processed the videos and maps according to the author's script, and the process went smoothly. However, when I run main.py, I find that it still needs a folder called "semantic". Where can the images in this folder be downloaded, or what script is needed to generate them?

@JWFangit (Owner)

Hi. The semantic images in our work are obtained using the DeepLab-v3 model. You can use it to generate the semantic image for each RGB frame.
Thanks!

@Hou-XinTong

Thank you for your reply! DeepLabV3 has several implementations in different frameworks. Could you please provide the GitHub repository you used for processing the data? And is the model trained on the 80-category COCO dataset?


JWFangit commented May 4, 2023

Hi. The DeepLab-v3 model is pre-trained on the Cityscapes dataset.

@Hou-XinTong

I found a highly-rated DeepLabv3 project on GitHub and used its pre-trained model for the Cityscapes dataset. However, the semantic masks it generated for this project were of poor quality. Could you please provide a download link for the missing "semantic" folder, or the GitHub URL of the DeepLabv3 project and the pre-trained model you used to generate the semantic masks? Thanks a lot!


JWFangit commented May 18, 2023 via email

@shankargsetty

Any updates on providing the dataset on Google Drive?

7 participants