LLM Weaver

This Streamlit app helps you fine-tune and deploy LLMs in your own cloud (AWS, GCP, Azure, Lambda Cloud, TensorDock, Vast.ai, etc.) through a user interface, and then access them for inference.

To run workloads in the cloud, the app uses dstack.

Get started

1. Install requirements

```shell
pip install -r requirements.txt
```

2. Set up the dstack server

If you have default AWS, GCP, or Azure credentials on your machine, the dstack server will pick them up automatically.

Otherwise, you need to manually specify the cloud credentials in ~/.dstack/server/config.yml. For further details, refer to server configuration.
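For illustration, a `config.yml` that gives the server AWS credentials might look like the sketch below. The `projects`/`backends`/`creds` layout follows dstack's server configuration format, but treat the exact field names as an assumption to verify against the docs for your dstack version; the key values are placeholders.

```yaml
# ~/.dstack/server/config.yml — hypothetical example, verify against your dstack version
projects:
- name: main
  backends:
  - type: aws
    creds:
      type: access_key
      access_key: <your AWS access key ID>
      secret_key: <your AWS secret access key>
```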

Once the clouds are configured, start the server:

```shell
dstack server
```

Now you're good to run the app.

3. Run the app

```shell
streamlit run Inference.py
```
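By default, Streamlit serves the app locally on port 8501. If that port is taken or you want the app reachable from another machine, you can override the server settings in `.streamlit/config.toml` — a minimal sketch using Streamlit's standard `[server]` options (the values shown are examples, not requirements of this app):

```toml
# .streamlit/config.toml — optional overrides for where the app listens
[server]
port = 8502          # default is 8501
address = "0.0.0.0"  # assumption: you want to accept non-local connections
```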
