From 7929517a772864a59b4c1346ed353bbba4a3e4c7 Mon Sep 17 00:00:00 2001
From: Kartikay Khandelwal
Date: Tue, 16 Apr 2024 14:16:21 -0700
Subject: [PATCH] README fix

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index e3d2a29a6..bc6a6b390 100644
--- a/README.md
+++ b/README.md
@@ -135,7 +135,7 @@ For distributed training, tune CLI integrates with [torchrun](https://pytorch.or
 Llama2 7B + LoRA on two GPUs:
 
 ```bash
-tune run --nproc_per_node 2 full_finetune_distributed --config llama2/7B_full_distributed
+tune run --nproc_per_node 2 full_finetune_distributed --config llama2/7B_full
 ```
 
 > Tip: Make sure to place any torchrun commands **before** the recipe specification. Any CLI args after this will override the config and not impact distributed training.
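
The tip in the patched section distinguishes torchrun flags from config overrides. A minimal sketch of the intended argument ordering, assuming torchtune's `key=value` override syntax applies after the recipe and config; the `batch_size=8` override is illustrative and not part of this patch:

```bash
# Sketch of the ordering the README tip describes (assumption: torchtune
# accepts key=value config overrides after the recipe/config; batch_size=8
# is purely illustrative, not part of this patch).
#
# torchrun flags (--nproc_per_node) go BEFORE the recipe name; config
# overrides go AFTER it and take precedence over values in llama2/7B_full.
tune run --nproc_per_node 2 full_finetune_distributed \
  --config llama2/7B_full \
  batch_size=8
```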