Any way of porting this to Colab? #12

Open
datheia opened this issue Aug 6, 2022 · 10 comments

Comments

datheia commented Aug 6, 2022

Most of us don't have GPUs powerful enough to even run models with 6 billion parameters. Can we port this to Colab in any way so it would be more accessible?

moyix (Collaborator) commented Aug 6, 2022

This is a really interesting idea! Do you know if Google Colab has any way to listen on a network port that can be reached from the outside world?

datheia (Author) commented Aug 6, 2022

I remember a storyteller AI (KoboldAI, absolute banger). It opens an HTTP server through a Cloudflare tunnel; you then connect to that Cloudflare URL from your browser and can interact with it very smoothly, sending a prompt request and getting a response back from the server. Maybe something like that could be done in this case?
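The prompt-request/response pattern described above can be sketched with only the standard library. This is a hypothetical illustration, not FauxPilot's actual API (which is richer and OpenAI-compatible); the model call is a stub.

```python
# Minimal sketch of a prompt -> completion HTTP server. In a Colab setup,
# a tunnel (e.g. Cloudflare) would expose this port to the outside world.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


def fake_complete(prompt: str) -> str:
    """Stand-in for a real model call; just echoes a canned completion."""
    return prompt + "  # completion goes here"


class CompletionHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        reply = {"completion": fake_complete(body.get("prompt", ""))}
        payload = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # keep the demo quiet
        pass


def serve_in_background(port: int = 8000) -> HTTPServer:
    """Start the server on localhost in a daemon thread and return it."""
    server = HTTPServer(("127.0.0.1", port), CompletionHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A client would POST `{"prompt": "..."}` and read the JSON reply, exactly the round trip the KoboldAI setup does over its Cloudflare URL.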

moyix (Collaborator) commented Aug 6, 2022

I also just found this, which looks like it might be a good fit since the FauxPilot server is already using Flask: https://www.geeksforgeeks.org/how-to-run-flask-app-on-google-colab/

I will look into putting together a notebook! :) Thanks for the great suggestion!
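As a rough sketch of that Flask-on-Colab idea: the article's approach pairs a Flask app with a tunnel (it uses flask-ngrok's `run_with_ngrok(app)` so the port is reachable from outside Colab). The endpoint name and handler below are illustrative stubs, not FauxPilot's real API.

```python
# Hypothetical Flask endpoint for completions. In Colab you would also call
# run_with_ngrok(app) from flask_ngrok (per the article) and then app.run()
# to get a public URL; here we only define the app itself.
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.route("/codegen", methods=["POST"])
def codegen():
    prompt = request.get_json(force=True).get("prompt", "")
    # Stub: a real deployment would run the model here.
    return jsonify({"completion": prompt + "  # ..."})
```

The editor plugin would then point at the tunnel's public URL instead of localhost.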


deven367 commented Aug 7, 2022

I think Colab has started to ban tunneling; see the Colab FAQ.

[Screenshot of the relevant Colab FAQ section]

I used to use a similar tool called colabcode, which let you fire up VS Code in Colab on a remote server, but with the recent changes in their policy this is no longer allowed.

You can read more about this in abhishekkrthakur/colabcode#109.

datheia (Author) commented Aug 7, 2022

Well, isn't this project a server for code generation? Does it launch VS Code? You might have been banned because of the third rule. As for the tunneling, the Colab AI application I mentioned above (KoboldAI) is still up and running, even though it uses tunneling. Here's the link to their Colab.

enniosousa commented

How about serverless? With Cloud Run on Google Cloud Platform you pay per HTTP request, and the first 2 million requests are free.

moyix (Collaborator) commented Aug 8, 2022

Hmm, would serverless work when the models are really big, though? Loading the 16B model from disk -> GPU takes almost a minute, so I wouldn't want to do that on every completion request...
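The cold-start concern here maps to a well-known serverless pattern: state initialized at module scope (or behind a cache) survives across requests on a warm instance, so the expensive load is paid once per instance rather than per request. A minimal sketch, with the minute-long disk-to-GPU load simulated by a short sleep and all names illustrative:

```python
# Sketch of warm-instance model caching in a serverless handler.
# The sleep stands in for the ~1 minute 16B-model load; lru_cache
# ensures it runs at most once per process (i.e. per warm instance).
import time
from functools import lru_cache


@lru_cache(maxsize=1)
def get_model():
    """Simulate a slow disk -> GPU model load."""
    time.sleep(0.2)  # stand-in for the expensive load
    return object()  # stand-in for the loaded model


def handle_request(prompt: str) -> str:
    model = get_model()  # fast on every call after the first
    return prompt + "  # completion from warm model"
```

The catch moyix raises still applies: if the platform scales to zero or spins up new instances per burst of traffic, each fresh instance pays the full load again.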


xloem commented Aug 12, 2022

Not quite on topic, but: if the model layers were broken up, people could form small networks to share compute.

moyix (Collaborator) commented Aug 13, 2022

I think the network latency involved would make that pretty slow?
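Some back-of-envelope arithmetic behind this latency concern (all numbers below are illustrative assumptions, not measurements): autoregressive decoding generates one token at a time, so a pipeline split across volunteer machines pays the inter-host hops once per generated token.

```python
# Rough per-token network overhead if layers are pipelined across hosts,
# with one network hop between each pair of neighbouring hosts.

def per_token_overhead_ms(num_hosts: int, rtt_ms: float) -> float:
    """Network time added to each generated token."""
    return (num_hosts - 1) * rtt_ms


# e.g. 4 volunteer hosts with 50 ms round trips between neighbours:
overhead = per_token_overhead_ms(4, 50.0)  # 150 ms of network time per token
```

At 150 ms of pure network time per token, a 50-token completion would spend 7.5 s just on hops, before any compute, which is why this tends to be impractical for interactive code completion.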


Genuifx commented Mar 17, 2023

emm, any progress on this issue?
