erasure-tokens for other dataset #22

Open
jsw6872 opened this issue Mar 7, 2024 · 2 comments

Comments

@jsw6872 commented Mar 7, 2024

Thank you for your impressive work.

My question is the same as the title:
are there erasure-tokens available for any other datasets (e.g. flowers102, imagenet, coco, caltech101)?
Or how can I get erasure-token weights for a custom dataset?

@brandontrabucco (Owner)

Hello jsw6872,

Thanks for your interest in our work! Yes, I'll upload and share these via a Google Drive link shortly.

There are: (1) SD checkpoints fine-tuned to erase concepts, and (2) tokens specific to these SD checkpoints for DA-Fusion.

I will upload and share both for the missing datasets.
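
As a rough sketch of how such a checkpoint/token pair fits together (assuming the checkpoints are in diffusers format and the tokens are textual-inversion embeddings; the paths and token name below are hypothetical placeholders, not the actual shared files):

```python
# A minimal sketch, assuming diffusers-format checkpoints and
# textual-inversion token files. All paths and the token name are
# hypothetical placeholders.
import torch
from diffusers import StableDiffusionPipeline

# (1) Load an SD checkpoint fine-tuned to erase a concept.
pipe = StableDiffusionPipeline.from_pretrained(
    "./erased-concept-checkpoint", torch_dtype=torch.float16
).to("cuda")

# (2) Load the token trained against this specific checkpoint.
# Tokens are tied to the checkpoint they were trained with, so pairs
# from different runs should not be mixed.
pipe.load_textual_inversion("./tokens/class-0.bin", token="<class-0>")

image = pipe("a photo of a <class-0>").images[0]
image.save("augmented.png")
```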

@jsw6872 (Author) commented Mar 20, 2024

Thank you for the reply :)
If you don't mind, I have one more question.

I see that an embedding token is required when generating synthetic images with textual inversion. How do I generate an embedding .pt file for a specific dataset? Is fine_tune.py the right script? I want to make embedding tokens for the flowers102 dataset.
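
For reference, textual-inversion embedding files in the common diffusers convention are just a dict mapping the placeholder token to its learned embedding tensor; whether fine_tune.py writes exactly this format is an assumption, and the file names below are hypothetical:

```python
# A minimal sketch, assuming the common diffusers textual-inversion
# convention: the file holds a dict {placeholder_token: embedding_tensor}.
# File names here are hypothetical examples.
import torch

embeds = torch.load("learned_embeds.bin", map_location="cpu")
for token, embedding in embeds.items():
    print(token, tuple(embedding.shape))  # e.g. <flower-0> (768,)

# A .pt file is the same structure under a different extension.
torch.save(embeds, "flowers102-token.pt")
```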
