Allow different models in replicate.ai interface #17

Merged
6 commits merged into master on Sep 23, 2021

Conversation

mehdidc (Owner) commented Sep 22, 2021

@cjwbw Thanks again for providing an interface to the model on replicate.ai. I would now like to allow the user to select between different models. I have modified predict.py and download-weights.sh accordingly.

I would also like to update the image on https://replicate.ai/mehdidc/feed_forward_vqgan_clip/. Is cog push r8.im/mehdidc/feed_forward_vqgan_clip the correct way to do it, or should it be done on your side? I tried the command but got "docker: Error response from daemon: could not select device driver "" with capabilities: [[gpu]].", presumably because I don't have an NVIDIA GPU on my local machine.

chenxwh commented Sep 22, 2021

Hi @mehdidc,

Yes, that is the way, but you do need to push from a GPU machine when the cog.yaml file specifies gpu: true.
Also, it is recommended to load all the models in setup() if possible, because setup() only runs the first time and subsequent runs just call predict().
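
The pattern is roughly the following (a minimal sketch only, not the repository's actual predict.py; the checkpoint paths, the plain torch.load call, the default model name, and the omitted generation step are placeholders):

    import torch

    # Checkpoint names mentioned in this thread; the paths are placeholders for
    # wherever download-weights.sh puts the files.
    MODEL_PATHS = {
        "cc12m_32x1024_vitgan_v0.1.th": "cc12m_32x1024_vitgan_v0.1.th",
        "cc12m_32x1024_vitgan_v0.2.th": "cc12m_32x1024_vitgan_v0.2.th",
        "cc12m_32x1024_mlp_mixer_v0.2.th": "cc12m_32x1024_mlp_mixer_v0.2.th",
    }
    DEFAULT_MODEL = "cc12m_32x1024_vitgan_v0.2.th"  # illustrative default

    class Predictor:  # in the real predict.py this would subclass cog's predictor base class
        def setup(self):
            # Runs once when the container starts, so every checkpoint is
            # loaded a single time and kept in memory.
            self.nets = {
                name: torch.load(path, map_location="cpu")
                for name, path in MODEL_PATHS.items()
            }

        def predict(self, prompt, model=DEFAULT_MODEL):
            # Each prediction only looks up an already-loaded network.
            net = self.nets[model]
            # Generating and saving the image from the prompt with net is omitted here.
            ...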

I could adapt the code and push from my side?

mehdidc (Owner, Author) commented Sep 22, 2021

Hey @cjwbw, thanks for your answer. I did load all the models in setup() and put them in nets, but maybe I am missing something else. Yes, please adapt the code and push from your side then, thanks a lot!

chenxwh commented Sep 22, 2021

Hi @mehdidc, yes, I just looked more closely and the models are indeed loaded in setup(), you were right. I just tested the new models: "cc12m_32x1024_vitgan_v0.1.th" and "cc12m_32x1024_vitgan_v0.2.th" work, but "cc12m_32x1024_mlp_mixer_v0.2.th" complains with AttributeError: 'Rearrange' object has no attribute '_recipe'

mehdidc (Owner, Author) commented Sep 22, 2021

Super cool, thanks! The cc12m_32x1024_mlp_mixer_v0.2.th error is caused by an einops version mismatch (the newest release, 0.3.2, versus the older 0.3.0 I had to use in the requirements). I will change cog.yaml now to use einops 0.3.2.

EDIT: Ok done, it should work now
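
For reference, the pin goes in the python_packages list of cog.yaml; a minimal sketch of where it sits, not the repository's actual file (the Python version shown and the elided dependencies are illustrative):

    build:
      gpu: true
      python_version: "3.8"
      python_packages:
        # ...other dependencies from requirements.txt...
        - "einops==0.3.2"
    predict: "predict.py:Predictor"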

chenxwh commented Sep 22, 2021

I have pushed the model to the server now :) You just need to change line 35 in predict.py to:
def predict(self, prompt, model=DEFAULT_MODEL):

mehdidc (Owner, Author) commented Sep 22, 2021

Cool :) Oh yes indeed, done for predict.py.

mehdidc (Owner, Author) commented Sep 23, 2021

Thanks a lot @cjwbw, everything works fine, merging.

mehdidc merged commit e344d35 into master on Sep 23, 2021

bfirsh commented Sep 23, 2021

@mehdidc These new models are so cool. :D

We haven't added support for changing examples yet, but we can do it manually in the database for you if you'd like. Would you like to change the example that's displayed by default on the form? Maybe one using v0.2?

mehdidc (Owner, Author) commented Sep 23, 2021

@bfirsh Glad you like the new models :) Thanks for all the support; the service/web interface makes it so easy to test the models.
Actually, yes, I was wondering about that. Please change the default to the v0.2 vitgan with the prompt "At my feet the white-petalled daisies display the small suns of their center piece".

bfirsh commented Sep 23, 2021

Done! Let me know if there are any you want to delete/reorder too. We're working on a user interface... sorry about this... 😅

mehdidc (Owner, Author) commented Sep 23, 2021

Very nice, thanks! Sure, will let you know :)

mehdidc (Owner, Author) commented Sep 24, 2021

Hi @bfirsh, asking because someone was interested: is it possible to access the models through an API? I mentioned the possibility of using Docker + any HTTP client such as curl, as already described on the model page, but that of course runs the web service locally. You might also want to answer directly if you would like, as it relates to replicate.ai in general.
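
For context, the Docker + curl route mentioned above looks roughly like this (a sketch only; it assumes the pushed image serves cog's HTTP interface on port 5000 with a form-encoded /predict endpoint whose inputs mirror predict.py's prompt and model parameters; the exact commands are the ones shown on the replicate.ai model page):

    # Run the published image locally (needs an NVIDIA GPU and the NVIDIA container toolkit).
    docker run -d --gpus all -p 5000:5000 r8.im/mehdidc/feed_forward_vqgan_clip

    # Query the local web service with any HTTP client, e.g. curl.
    curl http://localhost:5000/predict -X POST \
        -F prompt="At my feet the white-petalled daisies display the small suns of their center piece" \
        -F model="cc12m_32x1024_vitgan_v0.2.th"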
