deploying to replicate #83

Closed
walter-grace opened this issue Aug 4, 2023 · 1 comment · Fixed by #162
Labels: enhancement (New feature or request)

Comments

@walter-grace

Describe the solution you'd like
I would love to see a Gorilla model hosted on Replicate; it would be nice to be able to use their API and hosting.
Additional context
Had a blast playing with the Colab.

@walter-grace added the enhancement label on Aug 4, 2023
@ShishirPatil
Owner

Community contributions welcome. I will try to get to this next weekend!

ShishirPatil added a commit that referenced this issue Feb 4, 2024
This PR provides a guide for users to set up private inference for
Gorilla models on Replicate. It includes instructions for configuring a
Gorilla model with Cog (Replicate's open source containerization tool),
building and pushing the Docker image, and running inference
programmatically using Replicate's Python client library. The guide
ensures users can easily deploy and interact with Gorilla models in a
private and controlled environment.

This addition is designed to facilitate a smoother workflow for users
seeking to utilize Gorilla models on a privately hosted endpoint via
Replicate, offering an alternative to the public zanino.berkeley.edu
endpoint.

Resolves #83

---------

Co-authored-by: Raman Varma <ramanvarma@Ramans-MacBook-Pro.local>
Co-authored-by: Shishir Patil <30296397+ShishirPatil@users.noreply.github.com>
devanshamin pushed a commit to devanshamin/gorilla that referenced this issue Jul 9, 2024 (…Patil#162), with the same commit message as above; a rough sketch of the setup it describes follows.
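As an illustration of the setup the commit message describes (a sketch only, not the code merged in #162; the model path, prompt field, and version string below are placeholders), the Cog side comes down to a `predict.py` that wraps the model, assuming the Gorilla weights load through Hugging Face transformers:

```python
# predict.py -- minimal Cog predictor sketch (not the merged guide's code).
# The model path is a placeholder; point it at wherever the Gorilla weights live.
from cog import BasePredictor, Input
from transformers import AutoModelForCausalLM, AutoTokenizer


class Predictor(BasePredictor):
    def setup(self):
        # Runs once when the container starts: load the tokenizer and weights.
        model_id = "path/to/gorilla-weights"  # placeholder
        self.tokenizer = AutoTokenizer.from_pretrained(model_id)
        self.model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    def predict(self, prompt: str = Input(description="Natural-language API request")) -> str:
        # Runs per request: tokenize the prompt, generate, and return the decoded text.
        inputs = self.tokenizer(prompt, return_tensors="pt").to(self.model.device)
        outputs = self.model.generate(**inputs, max_new_tokens=256)
        return self.tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Keeping the load in `setup()` means the weights are read once per container start rather than once per request. After `cog build` and `cog push` publish the image to a private model on Replicate, inference from Replicate's Python client looks roughly like this, with `REPLICATE_API_TOKEN` set in the environment and the model/version string swapped for your own:

```python
# Minimal sketch of calling a privately hosted Gorilla model with Replicate's
# Python client; "your-username/gorilla:<version-hash>" is a placeholder.
import replicate

output = replicate.run(
    "your-username/gorilla:<version-hash>",
    input={"prompt": "I want to translate 'Hello world' into French."},
)
print(output)
```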