
Reproducing Results in Table 1 #30

Closed
alasdairtran opened this issue Mar 3, 2021 · 4 comments
@alasdairtran

Hi Zongyi,

I really enjoyed reading your paper! I was having a go at reproducing your numbers in Table 1. I'm wondering if the numbers on the last row come from the test_l2 variable of the final epoch?

If I clone your repo and run python fourier_1d.py, I get test_l2 = 0.00254360 at epoch 500. In Table 1 of your paper, you report 0.0160 for the FNO model with s = 1024, which is an order of magnitude larger. Am I looking at the right metric?
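For context, my understanding is that test_l2 is the average relative L2 error over the test set (the LpLoss in the repo, if I'm reading it right). A minimal sketch of the metric I'm assuming, on flat vectors rather than the repo's batched tensors:

```python
import math

def relative_l2(pred, true):
    """Relative L2 error: ||pred - true||_2 / ||true||_2.
    (The repo's LpLoss computes this per sample and averages over
    the batch; this is the single-sample version.)"""
    num = math.sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)))
    den = math.sqrt(sum(t ** 2 for t in true))
    return num / den

# A prediction off by 1% everywhere gives a relative error of 0.01.
print(round(relative_l2([1.01, 2.02, 3.03], [1.0, 2.0, 3.0]), 4))  # 0.01
```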

@zongyi-li
Collaborator

Hi Alasdair, in the paper we used the burgers_R10.mat dataset with modes=16, width=64, and epoch=500. If you are getting better results, you are probably using the burgers_v1000.mat dataset, which has smoother initial conditions and is therefore easier to model.

@alasdairtran
Author

No, I was using burgers_data_R10.mat from your Google Drive, and I didn't change any code in fourier_1d.py.

Upon closer inspection, I was able to get a test_l2 much closer to the number in Table 1 when I reverted a few commits. I'm guessing that the order-of-magnitude improvement comes from the commit where you removed the batch norm. It's interesting that batch norm makes the performance so much worse here.

@zongyi-li
Collaborator

That's interesting. We have observed that the Fourier neural operator usually doesn't require batch norm, but it's surprising that removing it gives an order-of-magnitude improvement. Thanks for letting me know.

@zongyi-li
Collaborator

I confirm I get test_l2 = 0.00286 too.
