Loading SALICON dataset #9
Comments
Hi @spandanagella, You can download the ground-truth density maps and fixation maps from this page: http://salicon.net/challenge-2017/. If you want to replicate our results, you have to use the original release of the SALICON dataset.
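For anyone unsure what the per-image fixation annotations encode: a fixation map is simply a binary matrix with ones at fixated pixels. A minimal NumPy sketch of building one from fixation points (an illustration only, not the repository's official preprocessing; the `fixation_map` name and (row, col) coordinate convention are assumptions):

```python
import numpy as np

def fixation_map(points, height, width):
    """Turn a list of (row, col) fixation points into a binary fixation map.

    Points falling outside the image bounds are silently skipped.
    """
    fmap = np.zeros((height, width), dtype=np.uint8)
    for r, c in points:
        if 0 <= r < height and 0 <= c < width:
            fmap[r, c] = 1
    return fmap

# Example: three fixations on a 480x640 image
fmap = fixation_map([(10, 20), (100, 200), (479, 639)], 480, 640)
print(fmap.sum())  # number of fixated pixels
```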
Hi @marcellacornia,
Yes, @marcellacornia, I checked all the datasets on this page, including the previous release of SALICON (Matlab files and saliency maps, used in the '15 and '16 challenges). The data structure is the same as I described before, and it does not correspond to the data processing in the code.
Please try changing the
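As a general recipe, the continuous ground-truth density maps are obtained by blurring the binary fixation map with a Gaussian kernel. A hedged sketch of that step (the `density_map` helper and the `sigma` value are assumptions for illustration, not the official SALICON parameters):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def density_map(fmap, sigma=19.0):
    """Blur a binary fixation map into a continuous saliency density map.

    sigma is a placeholder (roughly one degree of visual angle at typical
    viewing setups); verify it against the dataset you are replicating.
    The result is rescaled so its maximum is 1.
    """
    dmap = gaussian_filter(fmap.astype(np.float64), sigma)
    m = dmap.max()
    return dmap / m if m > 0 else dmap
```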
Hi @Mastya, Were you able to train the models with the above preprocessing code? Spandana
Hi @spandanagella,
Hi @marcellacornia,
Hi @SenJia, The results in Table IV were obtained by using the output of the Attentive ConvLSTM at different timesteps as input for the rest of the model. The results are on the SALICON validation set, using the 2015 version of the dataset. In Table IV, the pre-computed saliency maps we released were used to compute the results with T=4. |
@prachees Me too, the loss starts from a NaN value. It looks like
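A common cause of a loss that is NaN from the very first iteration is taking the logarithm of zero when the predicted or ground-truth maps are normalised inside a KL-divergence loss. A generic epsilon-guarded sketch in NumPy (an illustration of the fix, not this repository's exact loss function):

```python
import numpy as np

EPS = 1e-7  # small constant to keep divisions and logs finite

def kl_divergence(y_true, y_pred):
    """KL divergence between two saliency maps, guarded against NaN.

    Both maps are normalised to sum to 1; EPS prevents division by zero
    and log(0) when a map contains empty regions.
    """
    y_true = y_true / (y_true.sum() + EPS)
    y_pred = y_pred / (y_pred.sum() + EPS)
    return float(np.sum(y_true * np.log(EPS + y_true / (y_pred + EPS))))
```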
Hi,
I'm trying to retrain the models on SALICON using your code, but I don't see the fixations and fixation maps data in the form the code expects, i.e., separate files for each image. Is there a preprocessing step I'm supposed to run before using the code?
Can you give me pointers to the dataset download URL?
I'm currently downloading the dataset from
http://salicon.net/download/
Thank you so much!
Spandana