Type of issue
Code doesn't work
Description
If you use any of the arguments from the tutorial with the dotnet new aichatweb template, for example:
dotnet new aichatweb --framework "net9.0" --AiServiceProvider "ollama" --VectorStore "local"
the command errors out:
Error: Invalid option(s):
--framework
'--framework' is not a valid option
net9.0
'net9.0' is not a valid option
--AiServiceProvider
'--AiServiceProvider' is not a valid option
ollama
'ollama' is not a valid option
--VectorStore
'--VectorStore' is not a valid option
local
'local' is not a valid option
This is apparently because the template has been updated and these options have been renamed; the current names are shown in the help output below:
λ dotnet new aichatweb -h
AI Chat Web App (C#)
Author: Microsoft
Description: A project template for creating an AI chat application, which uses retrieval-augmented generation (RAG) to chat with your own data.
Usage:
dotnet new aichatweb [options] [template options]
Options:
-n, --name <name> The name for the output being created. If no name is specified, the name of the output directory is used.
-o, --output <output> Location to place the generated output.
--dry-run Displays a summary of what would happen if the given command line were run if it would result in a template creation.
--force Forces content to be generated even if it would change existing files.
--no-update-check Disables checking for the template package updates when instantiating a template.
--project <project> The project that should be used for context evaluation.
-lang, --language <C#> Specifies the template language to instantiate.
--type <project> Specifies the template type to instantiate.
Template options:
-F, --Framework <net9.0> The target framework for the project.
Type: choice
net9.0 Target net9.0
Default: net9.0
--provider <azureopenai|githubmodels|ollama|openai> Type: choice
azureopenai Uses Azure OpenAI service
githubmodels Uses GitHub Models
ollama Uses Ollama with the llama3.2 and all-minilm models
openai Uses the OpenAI Platform
Default: githubmodels
--vector-store <azureaisearch|local|qdrant> Type: choice
local Uses a JSON file on disk. You can change the implementation to a real vector database before publishing.
azureaisearch Uses Azure AI Search. This also avoids the need to define a data ingestion pipeline, since it's managed by Azure AI Search.
qdrant Uses Qdrant in a Docker container, orchestrated using Aspire.
Default: local
--managed-identity Use managed identity to access Azure services
Enabled if: (!UseAspire && VectorStore != "qdrant" && (AiServiceProvider == "azureopenai" || AiServiceProvider == "azureaifoundry" || VectorStore == "azureaisearch"))
Type: bool
Default: true
--aspire Create the project as a distributed application using .NET Aspire.
Type: bool
Default: false
-C, --ChatModel <ChatModel> Model/deployment for chat completions. Example: gpt-4o-mini
Type: string
-E, --EmbeddingModel <EmbeddingModel> Model/deployment for embeddings. Example: text-embedding-3-small
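For reference, based on the help output above, the equivalent of the tutorial command using the renamed options would presumably be:
dotnet new aichatweb --Framework "net9.0" --provider "ollama" --vector-store "local"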
Page URL
Content source URL
https://github.com/dotnet/docs/blob/main/docs/ai/quickstarts/ai-templates.md
Document Version Independent Id
fd993daa-ad67-5a5d-873e-96e70b4256c9
Platform Id
0689a818-6031-ad99-46f1-044d1ce482aa
Article author
Metadata
- ID: 9ccd0c8e-4044-19de-f19d-f4478b34e00e
- PlatformId: 0689a818-6031-ad99-46f1-044d1ce482aa
- Service: dotnet
- Sub-service: intelligent-apps