Feature/create dedicated inference#1818

Merged
anup-deka merged 3 commits into digitalocean:main from rarora-it:main
Mar 20, 2026
Conversation

@rarora-it (Contributor)

Adds CLI support for creating dedicated inference endpoints via:

```
doctl dedicated-inference create --spec spec.yaml
doctl dedicated-inference create --spec spec.yaml --hugging-face-token "hf_mytoken"
```

Tests:

- `TestDedicatedInferenceCommand` - Verifies the command tree: `dedicated-inference` exists with a `create` subcommand.
- `TestRunDedicatedInferenceCreate` - Creates from a JSON spec file without secrets; asserts that the correct `DedicatedInferenceCreateRequest` (spec with VPC, model deployments, accelerators) is passed to the service.
- `TestRunDedicatedInferenceCreate_WithHuggingFaceToken` - Same as above, but also sets the Hugging Face token.
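The PR does not show the shape of the spec file itself. Going only by the fields the tests assert on (VPC, model deployments, accelerators), a spec might look roughly like the sketch below; every field name here is an illustrative assumption, not the actual schema:

```yaml
# Hypothetical spec for `doctl dedicated-inference create --spec spec.yaml`.
# Field names are guesses inferred from the test descriptions above
# (VPC, model deployments, accelerators) -- not the real schema.
name: my-inference-endpoint          # assumed: endpoint name
vpc_uuid: 00000000-0000-0000-0000-000000000000   # assumed: target VPC
model_deployments:
  - model: meta-llama/Llama-3-8B-Instruct        # assumed model identifier
    replicas: 1
    accelerators:
      - type: gpu                    # assumed accelerator spec
        count: 1
```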

anup-deka
anup-deka previously approved these changes Mar 20, 2026
@anup-deka anup-deka merged commit b72eaac into digitalocean:main Mar 20, 2026
9 checks passed
