Hi, thanks for sharing the nicely written code on the wavelet neural operator.
I wonder which version of FNO was used in the comparison, for example in Table 2 of https://arxiv.org/abs/2109.13459v1?
Because in Table 2, FNO1d still shows an evaluation relative error of about 1e-2 on the Burgers' benchmark. In the new implementation, if `BatchNorm` is removed and the ReLU activation is changed to GELU or SiLU, 500 epochs of `MultiStepLR` scheduling yield an evaluation relative error of about 2.5e-3 to 2.7e-3 (see neuraloperator/neuraloperator#30); even with `OneCycleLR` and only 100 epochs, the evaluation error is around 4.1e-3.
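For concreteness, here is a minimal PyTorch sketch of the training-configuration change described above: no `BatchNorm`, GELU activation, and `MultiStepLR` scheduling over 500 epochs. The channel width, milestones, and decay factor are illustrative placeholders, not the actual FNO1d hyperparameters.

```python
import torch
import torch.nn as nn

# Stand-in for one FNO1d pointwise-conv block: no BatchNorm, GELU
# activation (the original used ReLU). The width 64 is a placeholder.
block = nn.Sequential(nn.Conv1d(64, 64, kernel_size=1), nn.GELU())

optimizer = torch.optim.Adam(block.parameters(), lr=1e-3)
# MultiStepLR halves the LR at each milestone; milestones here are
# illustrative, chosen only to span the 500-epoch run.
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[100, 200, 300, 400], gamma=0.5)

for epoch in range(500):
    optimizer.step()   # actual training step (forward/backward) elided
    scheduler.step()

# After passing all four milestones, lr = 1e-3 * 0.5**4 = 6.25e-5.
print(optimizer.param_groups[0]["lr"])
```

The alternative mentioned above would swap in `torch.optim.lr_scheduler.OneCycleLR` with `total_steps` set for a 100-epoch run; it anneals the LR per step rather than per milestone, which is why it converges in fewer epochs at a slightly higher error here.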