You must reference the code you copied #1

Closed
ching-sui1995 opened this issue Feb 10, 2021 · 6 comments

Comments

@ching-sui1995

First, most of your code is taken from (https://github.com/jeonsworld/ViT-pytorch), which is owned by @jeonsworld.
Second, the entire idea of your paper is taken from (https://arxiv.org/abs/2012.15840). In your paper, you never mention that they were the first to propose this architecture and that your work is derived from theirs.

This is very unprofessional. I hope the famous people in your paper already know about your conduct.

@yuyinzhou
Collaborator

Thanks for pointing out this issue.

  1. About the missing code reference: we have added a reference to (https://github.com/jeonsworld/ViT-pytorch).
  2. About the concurrent arXiv paper: our proposed TransUNet was inspired by ViT and U-Net (before the release of https://arxiv.org/abs/2012.15840), in that ViT is powerful at extracting global contexts whereas U-Net is powerful at segmenting finer details. We also want to note that one of our key points is the skip connection used in TransUNet (Sec. 4, Fig. 2), which makes our architecture quite different from this arXiv reference. Lastly, we would like to highlight that we already cite this arXiv reference in our paper.
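
[Editor's note: the architectural distinction described above can be illustrated with a minimal, shape-level sketch. This is hypothetical code, not taken from either repository: a transformer bottleneck between a downsampling encoder and an upsampling decoder, where the decoder concatenates the saved encoder features (the U-Net-style skip connections that the reply says set TransUNet apart from a skip-free design like SETR).]

```python
import numpy as np

# Hypothetical sketch (shapes only, NHWC layout). Conv/attention layers are
# replaced by cheap stand-ins so the skip-connection wiring is visible.

def encoder(x, stages=2):
    """Downsample 2x per stage; keep each stage's feature map for the skips."""
    skips = []
    for _ in range(stages):
        skips.append(x)
        x = x[:, ::2, ::2, :]  # stand-in for a strided conv
    return x, skips

def transformer_bottleneck(x):
    """Stand-in for ViT layers: flatten to tokens, mix globally, reshape back."""
    b, h, w, c = x.shape
    tokens = x.reshape(b, h * w, c)
    tokens = tokens + tokens.mean(axis=1, keepdims=True)  # crude global context
    return tokens.reshape(b, h, w, c)

def decoder(x, skips):
    """Upsample 2x per stage and concatenate the matching encoder feature.
    Dropping the `np.concatenate` line would give a skip-free (SETR-like) decoder."""
    for skip in reversed(skips):
        x = x.repeat(2, axis=1).repeat(2, axis=2)  # stand-in for an up-conv
        x = np.concatenate([x, skip], axis=-1)     # the U-Net skip connection
    return x

x = np.zeros((1, 16, 16, 8))
z, skips = encoder(x)
z = transformer_bottleneck(z)
y = decoder(z, skips)
print(y.shape)  # (1, 16, 16, 24): channels grow because skips are concatenated
```

The point of the sketch is only that the decoder's inputs include encoder features at matching resolutions; with the concatenation removed, the decoder sees the bottleneck output alone.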

P.S. Please check http://cvpr2020.thecvf.com/submission/main-conference/author-guidelines for the definition of a concurrent submission in case you are not aware of it. I wish you would educate yourself before posting comments next time.

@ching-sui1995
Author

I am glad that my comment here forced you to add the source repository from which you got the code. So I believe it is you who should educate yourself about referencing public code you publish (not me, as you suggested). Again, most of your code is from this repository.

You have cited the paper I mentioned, but you never acknowledged that it proposed a similar architecture before yours, nor did you compare against it. Your paper cannot be considered a concurrent submission to a paper posted months earlier and submitted to a different conference. You should really admit your mistake, take the high road, and post a revised version to arXiv. You should discuss the matters I raised with your professor and seek advice before submitting this to any actual conference.

@yuyinzhou
Collaborator

I think your second suggestion is invalid and truly unprofessional. Please feel free to bring this matter to my advisor or anyone else you like.

@ching-sui1995
Author

Suggesting that you acknowledge prior papers is never unprofessional. Whether I bring it up or not, the community can see and judge your contribution and the way you treated that paper. Enough said here.

@segtran

segtran commented Feb 11, 2021

@ching-sui1995 As a third party who has built a similar architecture (a transformer for segmentation), I'd like to join this discussion. My main points:

  1. First of all, you can find the repo of my own model at (https://github.com/segtran/segtran). It was done last July-August and won a place in a competition. (This is not my main GitHub account; as my paper is under review, I have to stay anonymous.)
  2. I believe quite a few teams were doing similar explorations concurrently last year, so I won't claim my model was the first, or earlier than TransUNet.
  3. All our works, whether SETR, TransUNet, or Segtran, have their own merits, and they should join forces to contribute to the "transformer for segmentation" paradigm. I don't think it helps the community if we fight over who proposed the idea first. If you keep an open mind, you will find contributions in each work that are absent from the other, similar works.

@ydzhang12345

Although this work is similar to SETR, the authors do present results without the skip connections (which is essentially SETR?) and show that the U-shaped skip connections are important. @ching-sui1995

@segtran Well said and good luck to your paper review.
