Possible release date #1
Hi, thanks for your interest. We are preparing for the open-source release and will publish both the code and the models. Please stay tuned. :)
Good work, looking forward to the code and pre-trained models.
Hi, thanks for your great work! When I used DINOv2 as a backbone, it gave a 1024-dimensional embedding as output, but you've written that it gives a tensor of size h×w×c. Could you tell me why?
Hi, you can call DinoVisionTransformer.forward_features to get a dict that contains the class token and the patch tokens. You can refer to https://github.com/facebookresearch/dinov2/blob/main/dinov2/models/vision_transformer.py#LL233C22-L233C22 for more details. :)
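To make the distinction concrete, here is a minimal sketch of how the two kinds of tokens relate. It assumes the dict keys used in the dinov2 repo (`x_norm_clstoken`, `x_norm_patchtokens`) and illustrative ViT-L/14 numbers (1024-dim embeddings, 224×224 input); the dict below is a stand-in for a real `model.forward_features(img)` call:

```python
import numpy as np

# Illustrative DINOv2 ViT-L/14 numbers: 14x14 patches on a 224x224 image.
batch, patch_size, dim = 1, 14, 1024
h = w = 224 // patch_size  # 16x16 grid -> 256 patch tokens

# Stand-in for the dict returned by forward_features (assumed key names).
feats = {
    "x_norm_clstoken": np.zeros((batch, dim)),           # one global embedding
    "x_norm_patchtokens": np.zeros((batch, h * w, dim)), # per-patch embeddings
}

# The class token is the single 1024-d vector you saw; the h x w x c
# tensor comes from reshaping the patch tokens onto the spatial grid.
dense = feats["x_norm_patchtokens"].reshape(batch, h, w, dim)
print(dense.shape)  # (1, 16, 16, 1024)
```

So a plain forward pass surfaces only the pooled 1024-d vector, while the patch tokens carry the spatial h×w×c features.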
Keep it up!
Thanks for the great article.
Great work. Looking forward to the release.
Hello, could you provide the code and pre-trained models for testing? I want to test on my own datasets.
For those waiting for the release, check out PerSAM (personalized SAM), which is already open source. I made two tutorial notebooks for it here: https://github.com/NielsRogge/Transformers-Tutorials/tree/master/PerSAM. The Matcher authors compare against PerSAM in their paper.
Any update? I'm also looking forward to the release!
Please tell me there's a release date announcement now; really looking forward to it!
When will it be updated?
When are you publishing the code?
The code has been released. Thank you for your attention!
Hi, thanks for the great article.
May I ask when you plan to release the source code? Will you release the pretrained models too?