This repository has been archived by the owner on Mar 29, 2024. It is now read-only.

Initial draft - How to Deploy your Petals Private Swarm #69

Draft · wants to merge 1 commit into `main`
2 changes: 1 addition & 1 deletion `blog/2023-06-26-welcome/index.md`

```diff
@@ -1,7 +1,7 @@
 ---
 slug: hello-prem
 title: Hello Prem!
-authors: [filippopedrazzinfp]
+authors: [filopedraz]
 tags: [llm, self-hosted, prem, open-source, welcome]
 description: "Hello, I am Filippo and I am currently contributing to Prem."
 image: "./banner.png"
```
```diff
@@ -1,7 +1,7 @@
 ---
 slug: serving-falcon-7b-fastapi-docker
 title: Serving Falcon 7B Instruct with FastAPI and Docker
-authors: [filippopedrazzinfp]
+authors: [filopedraz]
 tags: [llm, self-hosted, prem, open-source, fastapi, docker, falcon-7b]
 description: "In this tutorial, we will walk you through the process of serving the Falcon 7B Instruction model using FastAPI and Docker. The complete code for this tutorial is available on GitHub."
 image: "./banner.jpg"
```
2 changes: 1 addition & 1 deletion `blog/2023-07-05-chainlit-langchain-qa/index.md`

```diff
@@ -1,7 +1,7 @@
 ---
 slug: chainlit-langchain-prem
 title: Talk to your Data with ChainLit and Langchain
-authors: [ tiero, filippopedrazzinfp ]
+authors: [ tiero, filopedraz ]
 tags: [ llm, self-hosted, prem, open-source, langchain, chainlit, vicuna-7b, chroma, vector-store ]
 description: 'Build a chatbot that talks to your data with Prem using LangChain, Chainlit, Chroma Vector Store and Vicuna 7B model, self-hosted on your MacOS laptop.'
 image: './chainlit-langchain.gif'
```
72 changes: 72 additions & 0 deletions `blog/2023-09-23-petals-private-swarm/index.md` (new file)
---
slug: petals-private-swarm
title: How to Deploy your Petals Private Swarm
authors: [filopedraz]
tags: [llms, petals, swarm, private, prem]
description: "Get your own Petals Private Swarm up and running in 5 minutes"
---

<!--truncate-->

## Introduction to Petals

### What is Petals?

[Petals](https://github.com/bigscience-workshop/petals) lets you run large language models collaboratively: the model's layers are split across many machines, BitTorrent-style, so no single peer needs to hold the full model.

### Some Definitions and Concepts

- **DHT**: Distributed Hash Table
- **Torrent**: A torrent is a file sent via the BitTorrent protocol. It can be just about any type of file, such as a movie, song, game, or application.
- **Swarm**: A swarm is a group of peers that share a torrent and are both uploading and downloading the torrent's content.
- **Peer**: A peer is one instance of a BitTorrent client running on a computer on the Internet to which other clients connect and transfer data.
- **Tracker**: A tracker is a server that keeps track of which seeds and peers are in the swarm.
- **Seed**: A seed is a client that has a complete copy of the data of a certain torrent. Once your BitTorrent client finishes downloading, it will remain open until you click the Finish button (or otherwise close it). This is known as being a seed or seeding.

### How does Petals work?

Each server hosts a subset of the model's transformer blocks and announces them in a DHT. Clients look up which peers serve which blocks and chain them together, so a single forward pass travels through several peers to cover the whole model. The BitTorrent definitions above map directly onto this: the swarm is the set of participating peers, and the DHT plays the tracker's role.

## Deploy your Petals Private Swarm

### 1. Run a Backbone Server

Create a DigitalOcean machine with at least 2 GiB of RAM and 1 CPU, and install the necessary dependencies (Docker). Then start the bootstrap DHT node:

```bash
docker run -d --net host --ipc host --volume petals-cache-backbone:/cache --name backbone --rm learningathome/petals:main python -m petals.cli.run_dht --host_maddrs /ip4/0.0.0.0/tcp/31337 --identity_path bootstrap1.id
```

> **Reviewer (Contributor):** Why do we have `--rm` here?
>
> **Author:** We can remove it.

Check the logs of the `backbone` container to get its multiaddresses (IP address, port, and peer ID).
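The multiaddresses can be fished out of the log output with a small helper. The exact log format is an assumption here, but the DHT node prints lines containing `/ip4/<address>/tcp/<port>/p2p/<peer-id>` entries, which this sketch extracts:

```python
import re

def extract_multiaddrs(log_text: str) -> list[str]:
    """Return the unique /ip4/.../tcp/.../p2p/... multiaddresses found in log output."""
    pattern = r"/ip4/[\w.]+/tcp/\d+/p2p/\w+"
    return sorted(set(re.findall(pattern, log_text)))

# Example with the kind of line the DHT node prints at startup:
sample = ("To join the DHT, use --initial_peers "
          "/ip4/209.38.217.30/tcp/31337/p2p/QmecL18cmRaDdAcRmA7Ctj1gyAeUYG433WppA1UWTHTew6")
print(extract_multiaddrs(sample))
```

In practice you would feed it the output of `docker logs backbone` instead of the sample string.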

### 2. Contribute to the Swarm

#### Cloud GPU Instance (NVIDIA)

Create a machine on `Paperspace` or `DataCrunch` and run the following command to join the swarm:

```bash
docker run -d --net host --ipc host --gpus all --volume petals-cache-node1:/cache --name node1 --rm learningathome/petals:main python -m petals.cli.run_server --port 31330 --num_blocks 20 petals-team/StableBeluga2 --initial_peers /ip4/209.38.217.30/tcp/31337/p2p/QmecL18cmRaDdAcRmA7Ctj1gyAeUYG433WppA1UWTHTew6 /ip4/127.0.0.1/tcp/31337/p2p/QmecL18cmRaDdAcRmA7Ctj1gyAeUYG433WppA1UWTHTew6
```

Fill `--initial_peers` with the multiaddresses you got from the `backbone` peer's logs.

#### Mac

On macOS, install Petals with pip and run a server natively:

```bash
brew install python
python3 -m pip install git+https://github.com/bigscience-workshop/petals
python3 -m petals.cli.run_server --public_name prem-app petals-team/StableBeluga2 --initial_peers /ip4/209.38.217.30/tcp/31337/p2p/QmecL18cmRaDdAcRmA7Ctj1gyAeUYG433WppA1UWTHTew6 /ip4/127.0.0.1/tcp/31337/p2p/QmecL18cmRaDdAcRmA7Ctj1gyAeUYG433WppA1UWTHTew6
```

#### Prem

- Install Prem Desktop App
- Go to `Settings`
- Enable `Swarm Mode`

### 3. Monitor your Private Swarm

- Run Prem Explorer

### 4. Consume your Private Swarm

- Run Prem App and connect to your Private Swarm
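Besides the Prem App, the swarm can also be consumed programmatically with the Petals Python client. This is a sketch, assuming `petals` is installed (`pip install petals`), reusing the backbone multiaddress from step 1, and pointing at the same model the servers are hosting:

```python
# Multiaddress of the backbone peer, taken from the `docker logs backbone`
# output in step 1 (replace with your own swarm's address).
INITIAL_PEERS = [
    "/ip4/209.38.217.30/tcp/31337/p2p/QmecL18cmRaDdAcRmA7Ctj1gyAeUYG433WppA1UWTHTew6",
]
# Must match the model that the swarm's servers are serving.
MODEL_NAME = "petals-team/StableBeluga2"

def generate(prompt: str, max_new_tokens: int = 32) -> str:
    """Run generation against the private swarm instead of the public one."""
    from transformers import AutoTokenizer
    from petals import AutoDistributedModelForCausalLM

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    # initial_peers makes the client bootstrap from the private backbone
    # rather than the public Petals DHT.
    model = AutoDistributedModelForCausalLM.from_pretrained(
        MODEL_NAME, initial_peers=INITIAL_PEERS
    )
    inputs = tokenizer(prompt, return_tensors="pt")["input_ids"]
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0])
```

Calling `generate("A private swarm is")` would then route the forward pass through the peers you started above.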

## Conclusion
2 changes: 1 addition & 1 deletion `blog/authors.yml`

```diff
@@ -3,7 +3,7 @@ tiero:
   title: Bitcoin wizard
   url: https://github.com/tiero
   image_url: https://github.com/tiero.png
-filippopedrazzinfp:
+filopedraz:
   name: Filippo Pedrazzini
   title: Core contributor @ PremAI
   url: https://github.com/filopedraz
```