Review #3 (review by editor) #159

colah commented Apr 27, 2018

The third review for this article in the Distill review process may be delayed. To facilitate timely peer review, one of Distill's editors, Chris Olah, has written a review. Chris will continue shepherding the paper, but a different Distill editor will make the final decision on article publication.

Conflicts of Interest: Chris knows some of the co-authors but doesn't have the kind of significant relationship that he would see as giving rise to a substantive conflict of interest. Chris may have some biases from working with the article as an editor.

This review was written using a working draft of the Distill Reviewer Worksheet. We do not expect present ratings to necessarily be comparable to ratings under the final version.


Advancing the Dialogue

(Distill articles only need to succeed at one of the following)

Does the article contain interesting novel results? [1-5, n/a]
n/a - This article isn't substantially about novel results.

Does the article provide a different way of thinking about a topic? [1-5, n/a]
3 - Many researchers would take away a new way of thinking, but nothing transformative.

Does the article provide a different or significantly better explanation of something? [1-5, n/a]
4

Is this article worth drawing attention to? [1-5]
4 - For the right audience, this would be significantly worth reading.

Comments
This article changed how I think about conditioning neural networks. Previously, I'd only concatenated on conditioning information. After reading this article, FiLM layers seem like the natural way to go about things and will likely be my default in the future. The article did a good job of demonstrating how FiLM can be used and how it relates to other approaches such as concatenation.
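
For context, the core operation the article builds on is simple to write down. Here is a minimal sketch of a FiLM layer in PyTorch; the class and argument names are my own illustration, not the authors' code:

```python
import torch
import torch.nn as nn

class FiLM(nn.Module):
    """Feature-wise linear modulation: scale and shift each feature map
    with gamma/beta predicted from a conditioning vector."""

    def __init__(self, cond_dim: int, num_channels: int):
        super().__init__()
        # A single linear map predicts both gamma and beta.
        self.to_gamma_beta = nn.Linear(cond_dim, 2 * num_channels)

    def forward(self, features: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # features: (batch, channels, height, width); cond: (batch, cond_dim)
        gamma, beta = self.to_gamma_beta(cond).chunk(2, dim=-1)
        gamma = gamma[:, :, None, None]  # broadcast over spatial dimensions
        beta = beta[:, :, None, None]
        return gamma * features + beta
```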

Scientific Correctness & Integrity

Are claims in the article well supported? To the extent relevant, are experiments in the article well designed, and interpreted fairly? [1-5]
4 - Claims are well supported, either by results in the paper or cited work.

How easy would it be to replicate (or falsify) the results? [1-5]
2 - The article is well explained, but doesn’t do anything special to aid reproduction.

Does the article cite relevant work? [1-5]
4 - Article does a good job of reviewing related literature.

Considered as a whole, does the article exhibit strong intellectual honesty and scientific hygiene? [1-5]
3 - Article exhibits normal and reasonable levels of scientific integrity, but hasn't done anything going beyond typical norms of the field.

Comments
Most of the article reviews previous work, and so this section is less relevant. That said, it is applicable to the section “Properties of FiLM-ed networks”, and it seems like that section could be done better:

  • The results presented for interpolation are essentially anecdotes. Could the authors allow users to change the values being interpolated between, or add an appendix with more examples?
  • I'm not sure exactly what I should take away from the t-SNE plots, and without the ability to inspect individual points it is difficult to interrogate them.

More generally, it might be nice to discuss any cases where the authors have observed FiLM not working.

Outstanding Communication

Basic expectations: Does the article make its topic and scope clear? Is the article well-organized? Does it avoid rambling?
4 - Article is well organized, clear, and on point.

Does the article follow good design practices? Do diagrams comply with the Distill style guide, or have reasons not to?
4 - Article makes effective use of well-designed diagrams.

Does the article provide useful new “tools for thinking” about the topic? (eg. visual way of thinking, new abstraction, better notation, analogy, etc)
2

Does the article include interfaces that surface qualitatively new insights?
1 - The article doesn’t provide significant contributions of this kind.

How readable is the paper, accounting for the difficulty of the topic? [1-5]
4 - Similar in exposition quality to the CTC article.

Comments:
The FiLM framework is a nice way to think about conditioning neural networks. The authors provide examples of visually presenting FiLM in computational graph diagrams, and use this medium to explain how it relates to other techniques.

The FiLM framework is conceptually helpful and the authors experiment with how to present it visually, but for the most part the article makes effective use of relatively standard visual tools for thinking about neural network architectures, rather than introducing novel abstractions and interfaces.
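
To make the relationship to concatenation concrete: concatenating a conditioning vector onto the features before a linear layer amounts to adding a conditioning-dependent bias, i.e. the shift-only special case of FiLM (gamma fixed to 1). A sketch of that equivalence, with illustrative names and dimensions of my own choosing:

```python
import torch
import torch.nn as nn

feat_dim, cond_dim, out_dim = 8, 4, 8
concat_linear = nn.Linear(feat_dim + cond_dim, out_dim)

features = torch.randn(2, feat_dim)
cond = torch.randn(2, cond_dim)

# Applying the linear layer to the concatenation...
y_concat = concat_linear(torch.cat([features, cond], dim=-1))

# ...is the same as a fixed transform of the features plus a
# conditioning-dependent shift (beta), with gamma implicitly equal to 1.
W_f = concat_linear.weight[:, :feat_dim]
W_c = concat_linear.weight[:, feat_dim:]
beta = cond @ W_c.T
y_split = features @ W_f.T + beta + concat_linear.bias

assert torch.allclose(y_concat, y_split, atol=1e-6)
```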
