Commit

Update readme (#213)
* Describe *why* Substratus was built
* Clear links to next-step docs
* Add no-BS quickstart
nstogner committed Aug 23, 2023
1 parent 8361996 commit a2af101
Showing 3 changed files with 109 additions and 39 deletions.
1 change: 1 addition & 0 deletions Makefile
@@ -227,6 +227,7 @@ docs: crd-ref-docs embedmd
--renderer=markdown
# TODO: Embed YAML examples into the generated API documentation.
# $(EMBEDMD) -w ./docs/api/generated.md
$(EMBEDMD) -w ./docs/*.md

# PLATFORMS defines the target platforms for the manager image to be built to provide support for multiple
# architectures. (i.e. make docker-buildx IMG=myregistry/mypoperator:0.0.1). To use this option you need to:
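
The `$(EMBEDMD) -w ./docs/*.md` line added above rewrites the docs in place, expanding the `[embedmd]:#` directives that appear throughout the README below so the fenced code blocks stay in sync with the referenced files. A rough sketch of doing the same thing by hand, assuming the Makefile uses the upstream `github.com/campoy/embedmd` tool:

```bash
# Assumption: the Makefile installs the upstream campoy/embedmd tool; if it pins
# a specific version, prefer that over @latest.
go install github.com/campoy/embedmd@latest

# Rewrite every doc in place, expanding the [embedmd]:# directives into the
# fenced code blocks that follow them.
embedmd -w ./docs/*.md
```
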
145 changes: 107 additions & 38 deletions docs/README.md
@@ -1,49 +1,118 @@
# Substratus

Deploy and fine-tune large language models on Kubernetes.

[Substratus](https://www.substratus.ai) is a cross-cloud substrate for training
and serving ML models. It extends the Kubernetes control plane to orchestrate ML
operations through the addition of new custom resources: Model, Server,
Dataset, and Notebook.
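
These resources behave like ordinary Kubernetes API types once the controller is installed (see the quickstart below). If you want to confirm they are registered, something along these lines should work, assuming the `substratus.ai/v1` API group used by the example manifests:

```bash
# List the custom resource types served by the substratus.ai API group.
kubectl api-resources --api-group=substratus.ai
```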

---

Features:

* Deploy state-of-the-art ML models in minutes. We ❤️ open source LLMs.
* Fine-tune without writing code.
* Built-in serving layer compatible with OpenAI APIs.
* Run Notebooks to experiment. Ship an identical serving environment to production.

Our one-minute demo shows the basics:

[![Watch the video](https://img.youtube.com/vi/CLyXKJHIQ6A/hq2.jpg)](https://youtu.be/CLyXKJHIQ6A)

We created Substratus because we believe:

* Installing an ML platform should take minutes, not weeks.
* Running state-of-the-art LLMs should be single-command simple.
* Fine-tuning on your own data should work out of the box.
* Simplicity should not exclude flexibility: all ML code should be customizable through a seamless Notebook experience.

Learn more on the website:

* [Intro Post](https://www.substratus.ai/blog/introducing-substratus)
* [Docs](https://www.substratus.ai/docs)
* [Quickstart](https://www.substratus.ai/docs/quickstart)
* [Overview](https://www.substratus.ai/docs/overview)
* [Architecture](https://www.substratus.ai/docs/architecture)

## Collaborators are welcome

See the [development docs](../docs/development.md) to get started. Our
[docs directory](../docs/) has all the details. Substratus is under rapid
development; we value stability too, so hang tight!

## Try it out!

Create a local Kubernetes cluster using Kind.

[embedmd]:# (../install/kind/up.sh bash /kind.*/ $)
```bash
kind create cluster --name substratus --config - <<EOF
apiVersion: kind.x-k8s.io/v1alpha4
kind: Cluster
nodes:
- role: control-plane
  extraPortMappings:
  - containerPort: 30080
    hostPort: 30080
EOF
```
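
Before installing anything, confirm the cluster is up and that kubectl is pointed at it. kind names its contexts `kind-<cluster name>`, so the cluster above should be reachable as `kind-substratus`:

```bash
# kind prefixes context names with "kind-", so the cluster above is kind-substratus.
kubectl cluster-info --context kind-substratus

# The single control-plane node should report Ready after a minute or so.
kubectl get nodes
```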

Install Substratus.

```bash
kubectl apply -f https://raw.githubusercontent.com/substratusai/substratus/main/install/kind/manifests.yaml
```
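
The manifest presumably registers the Substratus custom resource definitions along with the controller. A low-assumption way to check that the install landed, without guessing at namespace names:

```bash
# The Model, Server, Dataset, and Notebook CRDs should show up once the manifest is applied.
kubectl get crds | grep substratus.ai

# The controller pod(s) should be running; grep across all namespaces rather than
# assuming a particular namespace name.
kubectl get pods -A | grep -i substratus
```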

Import a small Open Source LLM.

```bash
kubectl apply -f https://raw.githubusercontent.com/substratusai/substratus/main/examples/facebook-opt-125m/base-model.yaml
```

[embedmd]:# (../examples/facebook-opt-125m/base-model.yaml yaml)
```yaml
apiVersion: substratus.ai/v1
kind: Model
metadata:
  namespace: default
  name: facebook-opt-125m
spec:
  image: substratusai/model-loader-huggingface
  params:
    name: facebook/opt-125m
```
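
The same Model shape looks reusable for other Hugging Face models by swapping `params.name`; here is a purely hypothetical, untested variant pointing the same loader at `facebook/opt-350m`, assuming the loader accepts arbitrary Hugging Face repo ids:

```bash
# Hypothetical variant of the manifest above; only the names differ.
kubectl apply -f - <<EOF
apiVersion: substratus.ai/v1
kind: Model
metadata:
  namespace: default
  name: facebook-opt-350m
spec:
  image: substratusai/model-loader-huggingface
  params:
    name: facebook/opt-350m  # assumption: any repo id the loader supports
EOF
```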

Serve the LLM.

```bash
kubectl apply -f https://raw.githubusercontent.com/substratusai/substratus/main/examples/facebook-opt-125m/base-server.yaml
```

[embedmd]:# (../examples/facebook-opt-125m/base-server.yaml yaml)
```yaml
apiVersion: substratus.ai/v1
kind: Server
metadata:
  name: facebook-opt-125m
spec:
  image: substratusai/model-server-basaran
  model:
    name: facebook-opt-125m
```

Check on the progress of the Model and the Server.

```bash
kubectl get ai
```
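
If either resource sits in a non-Ready state for a while, describing it is usually the quickest way to see what the controller is doing. This assumes the lowercased kind works as a resource name, which is the usual convention for CRDs:

```bash
# Inspect status conditions and recent events on the Model and Server.
kubectl describe model facebook-opt-125m
kubectl describe server facebook-opt-125m

# Cluster events, oldest first, often point at image pulls or scheduling issues.
kubectl get events --sort-by=.lastTimestamp
```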

When they report a `Ready` status, start a port-forward.

```bash
kubectl port-forward service/facebook-opt-125m-server 8080:8080
```

Open your browser to [http://localhost:8080/](http://localhost:8080/) or curl the LLM's API.

*PS: Because of the small size of this particular LLM, expect comically bad answers to your prompts.*

```bash
curl http://localhost:8080/v1/completions \
-H "Content-Type: application/json" \
-d '{ \
"model": "facebook-opt-125m", \
"prompt": "Who was the first president of the United States? ", \
"max_tokens": 10\
}'
```
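
If you have `jq` installed, you can pull out just the generated text. This assumes the server returns an OpenAI-style completions payload with a `choices[0].text` field, which is what the OpenAI-compatible API suggests:

```bash
# Extract only the completion text (assumes an OpenAI-style response shape).
curl -s http://localhost:8080/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "facebook-opt-125m", "prompt": "Who was the first president of the United States? ", "max_tokens": 10}' \
  | jq -r '.choices[0].text'
```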

Delete the local cluster.

[embedmd]:# (../install/kind/down.sh bash /kind.*/ $)
```bash
kind delete cluster --name substratus
```

If you want to try a more capable LLM on more substantial hardware, deploy Substratus in the cloud by following the [GCP quickstart guide](https://www.substratus.ai/docs/quickstart/gcp).
2 changes: 1 addition & 1 deletion docs/kubectl-applybuild-design.md
@@ -11,4 +11,4 @@ kubectl applybuild -f ./notebook.yaml .
kubectl applybuild -f ./dataset.yaml .
kubectl applybuild -f ./model.yaml .
kubectl applybuild -f ./server.yaml .
```