From 391e079b36437030ac7f2c5ddf83b70096bfec02 Mon Sep 17 00:00:00 2001
From: Stomzy <45337015+airamare01@users.noreply.github.com>
Date: Thu, 26 Sep 2024 15:51:09 +0300
Subject: [PATCH] Add FAQ for using model of preference on Cody CLI

This update introduces an answer to a question from users who would like
to know whether they can use a model of their choice to chat with Cody
on the CLI.
---
 docs/cody/faq.mdx | 10 ++++++++++
 1 file changed, 10 insertions(+)

diff --git a/docs/cody/faq.mdx b/docs/cody/faq.mdx
index 8abb5f999..35d51d87f 100644
--- a/docs/cody/faq.mdx
+++ b/docs/cody/faq.mdx
@@ -114,3 +114,13 @@
 Yes, Cody supports the following cloud development environments:
 - vscode.dev and GitHub Codespaces (install from the VS Code extension marketplace)
 - Any editor supporting the [Open VSX Registry](https://open-vsx.org/extension/sourcegraph/cody-ai), including [Gitpod](https://www.gitpod.io/blog/boosting-developer-productivity-unleashing-the-power-of-sourcegraph-cody-in-gitpod), Coder, and `code-server` (install from the [Open VSX Registry](https://open-vsx.org/extension/sourcegraph/cody-ai))
+
+### Can I use my LLM of preference to chat with Cody on CLI?
+
+Yes, you can. In the CLI, use the following command to get started, replacing `$name_of_the_model` with the LLM of your choice:
+
+```
+cody chat --model '$name_of_the_model' -m 'Hi Cody!'
+```
+
+For example, to use Claude 3.5 Sonnet, you'd run the following command in your terminal: `cody chat --model 'claude-3.5-sonnet' -m 'Hi Cody!'`