
Reward #26

Open
mratsim opened this issue Oct 6, 2018 · 2 comments


mratsim commented Oct 6, 2018

Reproducibility is a huge pain point to tackle, and kudos for spearheading this initiative.

However, implementing papers that are potentially sparsely documented, with implementation "details" that matter, is a very time-consuming endeavour. And that is not counting producing several test cases, documenting the code, and writing the paper.

I understand that your budget is probably quite limited, but at the very least inspectors should have the opportunity to attend ICLR (or perhaps other staple conferences).

A "reproducer" workshop might also send a strong signal and incentive to the academic community that reproducibility is not just a nice-to-have but a must-have.


anirudh9119 commented Oct 9, 2018

Hello,

@mratsim It's an interesting question. Along with some of the organizers of this reproducibility challenge, I organized the first reproducibility workshop. The aim of starting the workshop was threefold.

  1. Give credit or an award (by having a workshop at NIPS/ICML/ICLR) to authors who verify or fail to verify the results of a paper. This could have interesting implications: someone who tries to verify a result can come up with even better results (by using random "tricks" or by architectural changes), and a failure to verify could be a crucial negative result (e.g. a method XYZ does not work in ABC settings).

  2. At that time, I also felt that it is essential to revisit older baselines, as combining ideas from papers that are already published with new regularizers or tricks (or maybe new theory) can be helpful. Hence the idea behind organizing the workshop.

  3. Some papers can be notoriously hard to reproduce. For example, Jörg Bornschein from MILA reproduced DRAW (https://github.com/jbornschein/draw), which was considered very hard to reproduce at that time. So people who attempt a reproduction should get some reward, in the form of a workshop publication that could be cited in the future.

I'm not sure about ICLR, but I'm pretty sure someone will keep organizing the Reproducibility workshop at ICML and NIPS, and that should allow you to at least attend both of these conferences.

I have no involvement in this reproducibility challenge, so I'm speaking only for myself.

Thanks for your hard work! 👍

koustuvsinha (Member) commented

Hi @mratsim,

We have partnered with ReScience to publish selected reproducibility efforts as journal publications. That way inspectors can have their efforts published with a valid DOI. We will be announcing the integration and review process soon.
