
Submission for Issue #90 (#136)

Merged

Conversation

@francescobardi (Contributor) commented Jan 4, 2019

#90

@reproducibility-org changed the title from "#90" to "Submission for Issue #90" on Jan 4, 2019

@reproducibility-org (Collaborator) commented Feb 16, 2019

Hi, please find below a review submitted by one of the reviewers:

Score: 9
Reviewer 1 comment: This work tries to reproduce the Neural PDE Solver on the 2D Poisson equation. Some results from the original paper are reproduced. The report and code documentation are well written. The hyperparameters are thoroughly searched and the experiments look solid.
Confidence: 3
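For context, the reproduced paper learns iterative solvers for the discretized 2D Poisson equation with convergence guarantees. Below is a minimal NumPy sketch of the classical Jacobi iteration that such learned solvers build on; the grid size, source term, and iteration count are illustrative assumptions, not details taken from the submission.

```python
# Minimal sketch (not from the submission): classical Jacobi iteration for the
# discretized 2D Poisson equation -laplace(u) = f on the unit square with zero
# Dirichlet boundary conditions. Grid size and iteration count are illustrative.
import numpy as np

def jacobi_poisson_2d(f, n_iters=1000):
    """Approximate u solving -laplace(u) = f, with u = 0 on the boundary.

    f: (n, n) array of right-hand-side values at the interior grid points.
    """
    n = f.shape[0]
    h = 1.0 / (n + 1)             # grid spacing for n interior points per axis
    u = np.zeros((n + 2, n + 2))  # pad with boundary values (kept at zero)
    for _ in range(n_iters):
        # Jacobi update: average of the four neighbours plus the source term,
        # from the standard 5-point stencil (4u - neighbours) / h^2 = f.
        u[1:-1, 1:-1] = 0.25 * (
            u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:] + h**2 * f
        )
    return u[1:-1, 1:-1]

# Example: a point source in the middle of a 64x64 interior grid.
f = np.zeros((64, 64))
f[32, 32] = 1.0
u = jacobi_poisson_2d(f)
```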

@reproducibility-org (Collaborator) commented Feb 23, 2019

Hi, please find below a review submitted by one of the reviewers:

Score: 9
Reviewer 3 comment: This work does a great job of introducing the problem and reproducing some of the results of the paper "Learning Neural PDE solvers with convergence guarantees". The introduction and background clearly explain the problem and expand on the problem setting of the original paper. The authors lay out the mathematical concepts underlying the desired optimisation problem well. The experiments are very robust, with proper descriptions of the problem parameters used. The accompanying code base is also well documented, with an existing IPython notebook that could be helpful to other researchers and practitioners interested in reusing this work.
The authors don't explicitly state whether they reused any code from the author repository, so I am assuming they wrote it from scratch. While doing a hyperparameter search on the learning rate, the authors don't mention the optimizer in the report; the IPython notebook states the use of SGD. They could have tried other optimizers, such as Adam, to observe how performance varies with the learning rate in that case. It would be great if the authors could add a comment on the observed effect of the number of layers on performance. Also, stating the number of parameters in each model used could provide more insight into the computational and memory demands of the network.
Overall, it is an excellent report that reproduces some of the results of the original paper. The authors clearly describe the issues faced in reproducing the other experiments of the paper, which is certainly helpful to the authors of the original paper as well as to future practitioners. Some comments on improving reproducibility would add to the positives of this work.
Confidence: 4
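The reviewer's two suggestions can be made concrete with a short sketch. The following PyTorch snippet sweeps the learning rate for both SGD and Adam and reports each model's trainable-parameter count; the model architecture, data, and learning-rate grid are hypothetical placeholders, not taken from the submission's notebook.

```python
# Minimal sketch (not from the submission's code): sweep the learning rate for
# both SGD and Adam, and report each model's parameter count. The model, data,
# and learning-rate grid below are illustrative placeholders.
import torch
import torch.nn as nn

def count_parameters(model):
    # Total number of trainable parameters, as the reviewer suggests reporting.
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

def train_once(model, optimizer, x, y, n_steps=100):
    loss_fn = nn.MSELoss()
    for _ in range(n_steps):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
    return loss.item()

# Placeholder data and model standing in for the solver's training setup.
x = torch.randn(256, 16)
y = torch.randn(256, 1)

for lr in [1e-1, 1e-2, 1e-3]:
    for opt_name, opt_cls in [("SGD", torch.optim.SGD), ("Adam", torch.optim.Adam)]:
        model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
        optimizer = opt_cls(model.parameters(), lr=lr)
        final_loss = train_once(model, optimizer, x, y)
        print(f"{opt_name} lr={lr:g}: final loss {final_loss:.4f}, "
              f"{count_parameters(model)} parameters")
```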

@reproducibility-org (Collaborator) commented Mar 18, 2019

Hi, please find below a review submitted by one of the reviewers:

Score: 8
Reviewer 2 comment:

  • The report is largely well written, with only minor typos ("unkown", etc.).
  • Even though not all the experiments have been reproduced, the ones that have been reproduced have been done very nicely.
  • Issues encountered while reproducing the paper are also mentioned, and the team communicated with the authors as well (kudos!).
  • The code is readable and should be useful for people working on this problem.

Confidence: 3

@reproducibility-org (Collaborator) commented Mar 31, 2019

Meta Reviewer Decision: Accept
