From 4b37c7c9bf43b7d1eb3cc1b0700add2b4c6e2a28 Mon Sep 17 00:00:00 2001
From: "docsalot-app[bot]" <207601912+docsalot-app[bot]@users.noreply.github.com>
Date: Fri, 17 Oct 2025 04:56:26 +0000
Subject: [PATCH 1/5] docs: update cli-reference.mdx for changes #89

---
 cli-reference.mdx | 239 ++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 239 insertions(+)
 create mode 100644 cli-reference.mdx

diff --git a/cli-reference.mdx b/cli-reference.mdx
new file mode 100644
index 0000000..4a64125
--- /dev/null
+++ b/cli-reference.mdx
@@ -0,0 +1,239 @@
+---
+title: CLI Reference
+description: Complete reference for Magemaker command-line options
+---
+
+## Overview
+
+Magemaker provides a command-line interface for deploying and managing AI models across AWS, GCP, and Azure. This page documents all available CLI options.
+
+## Basic Usage
+
+```sh
+magemaker [OPTIONS]
+```
+
+## Command-Line Options
+
+### Cloud Configuration
+
+#### `--cloud [aws|gcp|azure|all]`
+
+Configure and initialize Magemaker for your cloud provider(s).
+
+```sh
+magemaker --cloud aws    # Configure AWS SageMaker
+magemaker --cloud gcp    # Configure GCP Vertex AI
+magemaker --cloud azure  # Configure Azure ML
+magemaker --cloud all    # Configure all providers
+```
+
+**Description**: Launches an interactive menu for configuring cloud credentials and deploying models. On first run, it prompts for the necessary credentials.
+
+**Use cases**:
+- Initial setup and configuration
+- Interactive model deployment
+- Querying deployed endpoints
+- Managing and deleting endpoints
+
+---
+
+### Deployment
+
+#### `--deploy <config-file>`
+
+Deploy a model using a YAML configuration file.
+
+```sh
+magemaker --deploy .magemaker_config/your-model.yaml
+```
+
+**Arguments**:
+- `<config-file>`: Path to a YAML deployment configuration file
+
+**Description**: Deploys models using an Infrastructure-as-Code approach. Recommended for production deployments and CI/CD pipelines.
+
+**Example YAML**:
+```yaml
+deployment: !Deployment
+  destination: aws
+  endpoint_name: my-model-endpoint
+  instance_count: 1
+  instance_type: ml.m5.xlarge
+
+models:
+  - !Model
+    id: google-bert/bert-base-uncased
+    source: huggingface
+```
+
+See the [Deployment Guide](/concepts/deployment) for the complete YAML reference.
+
+---
+
+#### `--hf <model-id>`
+
+Deploy a specific Hugging Face model directly from the command line.
+
+```sh
+magemaker --hf facebook/opt-125m --instance ml.m5.xlarge
+```
+
+**Arguments**:
+- `<model-id>`: Hugging Face model identifier (e.g., `facebook/opt-125m`)
+
+**Description**: Quick deployment of Hugging Face models without creating a YAML file. Can be combined with the `--instance` and `--cpu` flags for custom configurations.
+
+---
+
+#### `--instance <instance-type>`
+
+Specify the instance type for deployment.
+
+```sh
+magemaker --hf facebook/opt-125m --instance ml.g5.2xlarge
+```
+
+**Arguments**:
+- `<instance-type>`: Cloud provider instance type
+  - **AWS**: `ml.m5.xlarge`, `ml.g5.2xlarge`, etc.
+  - **GCP**: `n1-standard-4`, `g2-standard-12`, etc.
+  - **Azure**: `Standard_DS3_v2`, `Standard_NC6s_v3`, etc.
+
+**Description**: Overrides the default instance type for model deployment. Must be valid for the selected cloud provider.
+
+**Related**: See the [Deployment Guide](/concepts/deployment#cloud-specific-instance-types) for available instance types.
+
+---
+
+#### `--cpu <cpu-type>`
+
+Specify the CPU type for deployment.
+
+```sh
+magemaker --hf facebook/opt-125m --cpu intel --instance ml.m5.xlarge
+```
+
+**Arguments**:
+- `<cpu-type>`: CPU architecture or type specification
+
+**Description**: Specifies the CPU type for deployments. Useful for optimizing performance or meeting specific compute requirements.
+
+<Note>
+This option was added in PR #89 to provide more granular control over deployment configurations.
+</Note>
+
+---
+
+### Training
+
+#### `--train <training-config>`
+
+Fine-tune a model using a training configuration file.
+
+```sh
+magemaker --train .magemaker_config/train-config.yaml
+```
+
+**Arguments**:
+- `<training-config>`: Path to a YAML training configuration file
+
+**Description**: Fine-tune pre-trained models with your own data.
+
+**Example YAML**:
+```yaml
+training: !Training
+  destination: aws
+  instance_type: ml.p3.2xlarge
+  instance_count: 1
+  training_input_path: s3://your-bucket/data.csv
+  hyperparameters: !Hyperparameters
+    epochs: 3
+    per_device_train_batch_size: 32
+    learning_rate: 2e-5
+```
+
+See the [Fine-tuning Guide](/concepts/fine-tuning) for more details.
+
+---
+
+### Utility Options
+
+#### `--version`
+
+Display the current version of Magemaker and exit.
+
+```sh
+magemaker --version
+```
+
+---
+
+## Common Usage Patterns
+
+### Quick Deployment
+
+Deploy a Hugging Face model with a specific instance type:
+
+```sh
+magemaker --hf google-bert/bert-base-uncased --instance ml.m5.xlarge
+```
+
+### Production Deployment
+
+Use a YAML configuration for reproducible deployments:
+
+```sh
+magemaker --deploy .magemaker_config/production-model.yaml
+```
+
+### Multi-Cloud Setup
+
+Configure all cloud providers at once:
+
+```sh
+magemaker --cloud all
+```
+
+### Fine-tuning
+
+Train a model with custom data:
+
+```sh
+magemaker --train .magemaker_config/train-bert.yaml
+```
+
+---
+
+## Combining Options
+
+Some flags can be combined for more specific deployments:
+
+```sh
+# Deploy a Hugging Face model with a custom instance and CPU type
+magemaker --hf facebook/opt-125m --instance ml.m5.xlarge --cpu intel
+
+# Note: --deploy cannot be combined with --hf, as they are different deployment methods
+```
+
+---
+
+## Exit Codes
+
+- `0`: Success
+- `1`: General error
+- Other codes may indicate specific failures
+
+---
+
+## Environment Variables
+
+Magemaker reads configuration from a `.env` file. See [Environment Configuration](/configuration/Environment) for details.
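+
+A minimal `.env` sketch is shown below; the variable names are illustrative assumptions (the standard AWS credential variables), so check the configuration guides for the exact keys Magemaker expects.
+
+```sh
+# Hypothetical example values; never commit real credentials to version control
+AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
+AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
+AWS_DEFAULT_REGION=us-east-1
+```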
+
+---
+
+## See Also
+
+- [Quick Start](/quick-start) - Get started with Magemaker
+- [Deployment Guide](/concepts/deployment) - Detailed deployment documentation
+- [Configuration Guides](/configuration/AWS) - Cloud-specific setup instructions

From a76f4a2cd3317010152b655397e8d7bfd5cd8619 Mon Sep 17 00:00:00 2001
From: "docsalot-app[bot]" <207601912+docsalot-app[bot]@users.noreply.github.com>
Date: Fri, 17 Oct 2025 04:56:27 +0000
Subject: [PATCH 2/5] docs: update installation.mdx for changes #89

---
 installation.mdx | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/installation.mdx b/installation.mdx
index 1d843eb..1a96525 100644
--- a/installation.mdx
+++ b/installation.mdx
@@ -15,6 +15,9 @@ Install via pip:
 pip install magemaker
 ```
 
+<Tip>
+  After installation, check out the [CLI Reference](/cli-reference) for a complete guide to all available command-line options.
+</Tip>
 
 ## Cloud Account Setup
 
From 823448560d812e876be682853789089788331e8f Mon Sep 17 00:00:00 2001
From: "docsalot-app[bot]" <207601912+docsalot-app[bot]@users.noreply.github.com>
Date: Fri, 17 Oct 2025 04:56:28 +0000
Subject: [PATCH 3/5] docs: update mint.json for changes #89

---
 mint.json | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/mint.json b/mint.json
index ccb1843..951b693 100644
--- a/mint.json
+++ b/mint.json
@@ -38,7 +38,7 @@
     "mode": "auto"
   },
   "navigation": [
-    { 
+    {
       "group": "Getting Started",
       "pages": ["about", "installation", "quick-start"]
     },
@@ -66,6 +66,10 @@
         "concepts/models",
         "concepts/contributing"
       ]
+    },
+    {
+      "group": "Reference",
+      "pages": ["cli-reference"]
     }
   ],
   "footerSocials": {

From 9ebd1f0c0906fb9458f8c274c5a20c21f2181a5b Mon Sep 17 00:00:00 2001
From: "docsalot-app[bot]" <207601912+docsalot-app[bot]@users.noreply.github.com>
Date: Fri, 17 Oct 2025 04:56:28 +0000
Subject: [PATCH 4/5] docs: update quick-start.mdx for changes #89

---
 quick-start.mdx | 4 ++++
 1 file changed, 4 insertions(+)

diff --git a/quick-start.mdx b/quick-start.mdx
index 5853ef8..900a663 100644
--- a/quick-start.mdx
+++ b/quick-start.mdx
@@ -21,6 +21,10 @@ Supported providers:
 - `--cloud azure` Azure Machine Learning deployment
 - `--cloud all` Configure all three providers at the same time
 
+<Tip>
+  For a complete reference of all CLI options including `--hf`, `--instance`, and `--cpu` flags, see the [CLI Reference](/cli-reference).
+</Tip>
+
 
 ### List Models
 
From 649174f2a43297dd2d744c4af4a166128779b77c Mon Sep 17 00:00:00 2001
From: "docsalot-app[bot]" <207601912+docsalot-app[bot]@users.noreply.github.com>
Date: Fri, 17 Oct 2025 04:56:29 +0000
Subject: [PATCH 5/5] docs: update concepts/deployment.mdx for changes #89

---
 concepts/deployment.mdx | 22 ++++++++++++++++++++++
 1 file changed, 22 insertions(+)

diff --git a/concepts/deployment.mdx b/concepts/deployment.mdx
index 66ca7a9..c3fbc60 100644
--- a/concepts/deployment.mdx
+++ b/concepts/deployment.mdx
@@ -21,6 +21,28 @@ This method is great for:
 - Exploring available models
 - Testing different configurations
 
+### Command-Line Deployment
+
+For quick deployments, you can use command-line flags to deploy Hugging Face models directly:
+
+```sh
+magemaker --hf facebook/opt-125m --instance ml.m5.xlarge --cpu intel
+```
+
+Available flags:
+- `--hf <model-id>`: Specify a Hugging Face model to deploy
+- `--instance <instance-type>`: Set the instance type (e.g., `ml.m5.xlarge`, `n1-standard-4`)
+- `--cpu <cpu-type>`: Specify the CPU type for optimized performance
+
+This method is ideal for:
+- Quick testing and experimentation
+- One-off deployments
+- Scripts and automation
+
+<Tip>
+  See the [CLI Reference](/cli-reference) for complete documentation of all command-line options.
+</Tip>
+
 ### YAML-based Deployment
 
 For reproducible deployments and CI/CD integration, use YAML configuration files: