
Update quickstart-pytorch example to use app model #3116

Merged (11 commits, Apr 2, 2024)
40 changes: 40 additions & 0 deletions examples/quickstart-pytorch/README.md
@@ -47,6 +47,8 @@ Write the command below in your terminal to install the dependencies according to
pip install -r requirements.txt
```

______________________________________________________________________

Comment on lines +50 to +51

Contributor: I'm thinking we could remove these, since the section title immediately below will introduce one already.

Contributor Author: Sure.

## Run Federated Learning with PyTorch and Flower

Afterwards, you are ready to start the Flower server as well as the clients. You can simply start the server in a terminal as follows:
@@ -72,3 +74,41 @@ python3 client.py --partition-id 1
```

You will see that PyTorch is starting a federated training. Look at the [code](https://github.com/adap/flower/tree/main/examples/quickstart-pytorch) for a detailed explanation.
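For quick reference, since the command block above is collapsed in this diff view, a sketch of the legacy run it describes, assuming the `python3 server.py` / `python3 client.py --partition-id <N>` commands already used in this README:

```bash
# Terminal 1: start the Flower server (legacy mode)
python3 server.py

# Terminals 2 and 3: start one client each, pointing at a different data partition
python3 client.py --partition-id 0
python3 client.py --partition-id 1
```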

______________________________________________________________________
Contributor: Remove? (Also those in #3117?)

Contributor Author: Sure, I don't have a strong opinion about this.


## Run Federated Learning with PyTorch and `Flower Next`

### 1. Start the long-running Flower server (SuperLink)

```bash
flower-superlink --insecure
```

### 2. Start the long-running Flower clients (SuperNodes)

Start 2 long-running Flower clients in 2 separate terminal windows, using:

```bash
flower-client-app client:app --insecure
```

### 3. Run the Flower App

With both the long-running server (SuperLink) and two clients (SuperNodes) up and running, we can now run the actual Flower App:

```bash
flower-server-app server:app --insecure
```

Or, to try the workflow example, run:

```bash
flower-server-app server_workflow:app --insecure
```

Or, to try the custom server function example, run:

```bash
flower-server-app server_custom:app --insecure
```
Contributor: This was only relevant for the examples/app-pytorch example.

Contributor Author: Ah! I see.

26 changes: 20 additions & 6 deletions examples/quickstart-pytorch/client.py
@@ -2,7 +2,7 @@
import warnings
from collections import OrderedDict

import flwr as fl
from flwr.client import NumPyClient, ClientApp
from flwr_datasets import FederatedDataset
import torch
import torch.nn as nn
@@ -111,7 +111,7 @@ def apply_transforms(batch):


# Define Flower client
class FlowerClient(fl.client.NumPyClient):
class FlowerClient(NumPyClient):
    def get_parameters(self, config):
        return [val.cpu().numpy() for _, val in net.state_dict().items()]

@@ -131,8 +131,22 @@ def evaluate(self, parameters, config):
        return loss, len(testloader.dataset), {"accuracy": accuracy}


# Start Flower client
fl.client.start_client(
    server_address="127.0.0.1:8080",
    client=FlowerClient().to_client(),
def client_fn(cid: str):
    """Create and return an instance of Flower `Client`."""
    return FlowerClient().to_client()


# Flower ClientApp
app = ClientApp(
    client_fn=client_fn,
)


# Legacy mode
if __name__ == "__main__":
    from flwr.client import start_client

    start_client(
        server_address="127.0.0.1:8080",
        client=FlowerClient().to_client(),
)
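As a usage note, a sketch of launching this updated client in either mode, assuming the `flower-client-app` command from the README above and that the existing `--partition-id` argument handling is unchanged:

```bash
# App mode: a running SuperNode loads the `app` object defined in client.py
flower-client-app client:app --insecure

# Legacy mode: the __main__ block calls start_client() against 127.0.0.1:8080
python3 client.py --partition-id 0
```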
2 changes: 1 addition & 1 deletion examples/quickstart-pytorch/pyproject.toml
@@ -10,7 +10,7 @@ authors = ["The Flower Authors <hello@flower.ai>"]

[tool.poetry.dependencies]
python = ">=3.8,<3.11"
flwr = ">=1.0,<2.0"
flwr = ">=1.8.0,<2.0"
flwr-datasets = { extras = ["vision"], version = ">=0.0.2,<1.0.0" }
torch = "2.1.1"
torchvision = "0.16.1"
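As an aside, a minimal sketch of installing the bumped dependencies with Poetry, assuming the standard `poetry install` workflow (the `pip install -r requirements.txt` route below works just as well):

```bash
# Resolve and install the pinned dependencies, including flwr >=1.8.0,<2.0
poetry install

# Enter the environment before running the example
poetry shell
```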
2 changes: 1 addition & 1 deletion examples/quickstart-pytorch/requirements.txt
@@ -1,4 +1,4 @@
flwr>=1.0, <2.0
flwr>=1.8.0, <2.0
flwr-datasets[vision]>=0.0.2, <1.0.0
torch==2.1.1
torchvision==0.16.1
28 changes: 22 additions & 6 deletions examples/quickstart-pytorch/server.py
@@ -1,6 +1,7 @@
from typing import List, Tuple

import flwr as fl
from flwr.server import ServerApp, ServerConfig
from flwr.server.strategy import FedAvg
from flwr.common import Metrics


@@ -15,11 +16,26 @@ def weighted_average(metrics: List[Tuple[int, Metrics]]) -> Metrics:


# Define strategy
strategy = fl.server.strategy.FedAvg(evaluate_metrics_aggregation_fn=weighted_average)
strategy = FedAvg(evaluate_metrics_aggregation_fn=weighted_average)

# Start Flower server
fl.server.start_server(
    server_address="0.0.0.0:8080",
    config=fl.server.ServerConfig(num_rounds=3),

# Define config
config = ServerConfig(num_rounds=3)


# Flower ServerApp
app = ServerApp(
    config=config,
    strategy=strategy,
)


# Legacy mode
if __name__ == "__main__":
    from flwr.server import start_server

    start_server(
        server_address="0.0.0.0:8080",
        config=config,
        strategy=strategy,
)
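Similarly, a sketch of running the updated server in either mode, assuming the `flower-server-app` command shown in the README, with a SuperLink and two SuperNodes already running for the app mode:

```bash
# App mode: run the ServerApp defined as `app` in server.py
flower-server-app server:app --insecure

# Legacy mode: the __main__ block starts a standalone Flower server on 0.0.0.0:8080
python3 server.py
```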