
Evaluation data from table 1 #81

Open
BloodLemonS opened this issue Jan 17, 2024 · 5 comments

Comments

@BloodLemonS

How can I obtain the evaluation data for Table 1? When I test the released model on the DAVIS data with the provided evaluation.py, the values differ significantly from those in the paper.

@Paper99
Collaborator

Paper99 commented Feb 13, 2024

Just follow our instructions to perform the evaluation.
You may need to check the installed requirements.

@BloodLemonS
Author

I have checked the installation requirements. Could installing different versions of the third-party libraries, due to differences in device hardware configuration, lead to significant calculation errors?

@Paper99
Collaborator

Paper99 commented Feb 23, 2024

We have tested on both a 3090 and a V100, and the metrics are consistent. Could you please tell me whether you used our masks for evaluation?

@BloodLemonS
Author

Yes, we ran the test on a 3070 graphics card using the 50 provided test sequences. Except for one sequence with an inf result, the average PSNR of the other sequences was around 20.
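(For context on the inf result: an infinite PSNR is the expected output whenever the compared frames are pixel-identical, since the mean squared error is then zero. Below is a minimal, generic PSNR sketch, not the repository's evaluation.py, that illustrates this; the function name `psnr` and the 8-bit `max_val=255.0` default are assumptions for illustration.)

```python
import numpy as np

def psnr(pred, gt, max_val=255.0):
    # Hypothetical helper, not the repo's evaluation.py.
    # PSNR = 10 * log10(max_val^2 / MSE), computed in float64 to avoid
    # uint8 overflow when subtracting frames.
    mse = np.mean((pred.astype(np.float64) - gt.astype(np.float64)) ** 2)
    if mse == 0:
        # Identical frames: zero MSE gives an infinite PSNR, which then
        # breaks a plain average over sequences.
        return float("inf")
    return 10.0 * np.log10((max_val ** 2) / mse)

a = np.zeros((4, 4), dtype=np.uint8)
print(psnr(a, a.copy()))  # identical frames -> inf
```

A single inf value makes the naive mean over all 50 sequences inf as well, so evaluation scripts typically either skip such frames or average per-frame MSE before converting to PSNR.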

@Paper99
Collaborator

Paper99 commented Feb 23, 2024

This is very strange. Please check carefully that your environment configuration matches ours; a difference in GPU alone should not change the results.
