
Possible release date #1

Closed
pourfard opened this issue May 27, 2023 · 14 comments

Comments

@pourfard

Hi, thanks for the great article
May I ask you what time are you going to release the source code? Will you release the pretrained models too?

@lihengtao
Collaborator

Hi, thanks for your interest. We are preparing for the open-source release and will release both the code and the models. Please stay tuned. :)

@tensorboy

Good work, looking forward to the code and pre-trained models.

@ha1ha2hahaha

Hi, thanks for your great work! When I used DINOv2 as a backbone, it gave a 1024-dimensional embedding as output, but you've written that it gives a tensor of size h×w×c. Could you tell me why?

@lihengtao
Collaborator

Hi, thanks for your great work! When I used DINOv2 as a backbone, it gave a 1024-dimensional embedding as output, but you've written that it gives a tensor of size h×w×c. Could you tell me why?

Hi, you can call DinoVisionTransformer.forward_features to get a dict which contains the class token and the patch tokens. You can refer to https://github.com/facebookresearch/dinov2/blob/main/dinov2/models/vision_transformer.py#LL233C22-L233C22 for more details. :)
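To illustrate the shape question above, here is a minimal sketch of the shape arithmetic only (no model weights needed). It assumes the ViT-L/14 variant (embedding dimension 1024, patch size 14) and an illustrative 518×518 input; the 1024-dimensional vector comes from the pooled class token, while the patch tokens form the h×w×c spatial map:

```python
# Hedged sketch of DINOv2 output shapes, assuming ViT-L/14
# (embed dim 1024, patch size 14) and a 518x518 input image.

def patch_grid(image_size: int, patch_size: int) -> int:
    """Number of patches along one side of the image."""
    assert image_size % patch_size == 0, "image size must be divisible by patch size"
    return image_size // patch_size

h = w = patch_grid(518, 14)  # 37 x 37 patch grid
c = 1024                     # ViT-L/14 embedding dimension

# forward() returns only the pooled class token: one 1024-dim vector.
cls_shape = (1, c)
# forward_features() additionally returns patch tokens of shape (1, h*w, c),
# which can be reshaped to (1, h, w, c) to get the h x w x c feature map.
patch_tokens_shape = (1, h * w, c)
spatial_shape = (1, h, w, c)
print(cls_shape, patch_tokens_shape, spatial_shape)
# → (1, 1024) (1, 1369, 1024) (1, 37, 37, 1024)
```

In the actual model, the dict returned by `forward_features` exposes these tokens under keys such as `x_norm_clstoken` and `x_norm_patchtokens` (see the linked source for the exact names in your DINOv2 version).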

@leolle

leolle commented Jun 28, 2023

Hi, thanks for your interest. We are preparing for the open source and will release both the code and models. Please stay tuned.:)

Keep it up! (加油!)

@tieguanyin803

thanks for the great article

@evolu8

evolu8 commented Jul 18, 2023

Great work. Looking forward to the release.

@Jingxi245

Hello, could you provide the code and pre-trained models for testing? I want to test on my own datasets.

@NielsRogge

For those waiting for the release, check out PerSAM (personalized SAM), which is already open source. I made two tutorial notebooks for it here: https://github.com/NielsRogge/Transformers-Tutorials/tree/master/PerSAM

The Matcher authors do compare against PerSAM in their paper.

@SSUHan

SSUHan commented Sep 14, 2023

Any update? I'm also looking forward to the release!

@zzzyzh

zzzyzh commented Sep 28, 2023

Is there a release date announced yet? Really looking forward to it!

@feivelliu

When will it be updated?

@simoneangarano

When are you publishing the code?

@yangliu96
Collaborator

The code has been released. Thank you for your attention!
