@@ -20,6 +20,7 @@ TIP: You can also use `camel --help` or `camel <command> --help` to see available
| xref:jbang-commands/camel-jbang-dependency.adoc[camel dependency] | Displays all Camel dependencies required to run
| xref:jbang-commands/camel-jbang-dirty.adoc[camel dirty] | Check if there are dirty files from previous Camel runs that did not terminate gracefully
| xref:jbang-commands/camel-jbang-doc.adoc[camel doc] | Shows documentation for kamelet, component, and other Camel resources
| xref:jbang-commands/camel-jbang-explain.adoc[camel explain] | Explain what a Camel route does using AI/LLM
| xref:jbang-commands/camel-jbang-export.adoc[camel export] | Export to other runtimes (Camel Main, Spring Boot, or Quarkus)
| xref:jbang-commands/camel-jbang-get.adoc[camel get] | Get status of Camel integrations
| xref:jbang-commands/camel-jbang-hawtio.adoc[camel hawtio] | Launch Hawtio web console
@@ -0,0 +1,40 @@

// AUTO-GENERATED by camel-package-maven-plugin - DO NOT EDIT THIS FILE
= camel explain

Explain what a Camel route does using AI/LLM


== Usage

[source,bash]
----
camel explain [options]
----



== Options

[cols="2,5,1,2",options="header"]
|===
| Option | Description | Default | Type
| `--api-key` | API key for authentication. Also reads OPENAI_API_KEY or LLM_API_KEY env vars | | String
| `--api-type` | API type: 'ollama' or 'openai' (OpenAI-compatible) | ollama | ApiType
| `--catalog-context` | Include Camel Catalog descriptions in the prompt | | boolean
| `--format` | Output format: text, markdown | text | String
| `--model` | Model to use | DEFAULT_MODEL | String
| `--show-prompt` | Show the prompt sent to the LLM | | boolean
| `--stream` | Stream the response as it's generated (shows progress) | true | boolean
| `--system-prompt` | Custom system prompt | | String
| `--temperature` | Temperature for response generation (0.0-2.0) | 0.7 | double
| `--timeout` | Timeout in seconds for LLM response | 120 | int
| `--url` | LLM API endpoint URL. Auto-detected from 'camel infra' for Ollama if not specified. | | String
| `--verbose,-v` | Include detailed technical information | | boolean
| `-h,--help` | Display the help and sub-commands | | boolean
|===



include::partial$jbang-commands/examples/explain.adoc[]

@@ -0,0 +1,122 @@
== Examples

The `camel explain` command uses AI/LLM to explain Camel routes in plain English.
It supports multiple LLM providers including Ollama (local), OpenAI, Azure OpenAI, vLLM, LM Studio, and LocalAI.

=== Prerequisites

Start Ollama locally using Camel infra:

[source,bash]
----
camel infra run ollama
----

=== Basic Usage

Explain a YAML route:

[source,bash]
----
camel explain my-route.yaml
----
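
For illustration, such a `my-route.yaml` could contain a minimal route in the Camel YAML DSL (a hypothetical file; any valid Camel route file works):

[source,yaml]
----
# Hypothetical example route: fire a timer every second and log a greeting
- route:
    from:
      uri: timer:tick
      parameters:
        period: "1000"
      steps:
        - log: "Hello from Camel"
----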

Explain a Java route:

[source,bash]
----
camel explain OrderRoute.java
----

Explain multiple route files:

[source,bash]
----
camel explain route1.yaml route2.xml MyRoute.java
----

=== Output Options

Use verbose mode for detailed technical information:

[source,bash]
----
camel explain my-route.yaml --verbose
----

Output as Markdown for documentation:

[source,bash]
----
camel explain my-route.yaml --format=markdown
----
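
Since the explanation is written to standard output, the Markdown form can be redirected to a file for reuse in documentation (illustrative shell usage; the file name is arbitrary):

[source,bash]
----
camel explain my-route.yaml --format=markdown > route-explanation.md
----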

=== Prompt Options

Include Camel Catalog descriptions for more accurate explanations:

[source,bash]
----
camel explain my-route.yaml --catalog-context
----

Show the prompt sent to the LLM (useful for debugging):

[source,bash]
----
camel explain my-route.yaml --show-prompt
----

Use a custom system prompt:

[source,bash]
----
camel explain my-route.yaml --system-prompt="Focus on error handling and security aspects."
----

=== LLM Configuration

Use OpenAI or compatible services:

[source,bash]
----
camel explain my-route.yaml --url=https://api.openai.com --api-type=openai --api-key=sk-...
----

Use environment variables for the API key:

[source,bash]
----
export OPENAI_API_KEY=sk-...
camel explain my-route.yaml --url=https://api.openai.com --api-type=openai
----

Use a specific model:

[source,bash]
----
camel explain my-route.yaml --model=llama3.1:70b
----
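
Other OpenAI-compatible servers such as vLLM or LM Studio can be targeted the same way with `--api-type=openai`. The URL below is illustrative only, assuming LM Studio's default local port:

[source,bash]
----
# Illustrative: LM Studio exposes an OpenAI-compatible API on localhost:1234 by default
camel explain my-route.yaml --url=http://localhost:1234 --api-type=openai
----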

=== Advanced Options

Disable streaming (wait for complete response):

[source,bash]
----
camel explain my-route.yaml --stream=false
----

Adjust temperature (0.0 = deterministic, 2.0 = creative):

[source,bash]
----
camel explain my-route.yaml --temperature=0.3
----

Set a custom timeout (in seconds):

[source,bash]
----
camel explain my-route.yaml --timeout=300
----

Large diffs are not rendered by default.
@@ -126,6 +126,7 @@ public static void run(CamelJBangMain main, String... args) {
.addSubcommand("update", new CommandLine(new DependencyUpdate(main))))
.addSubcommand("dirty", new CommandLine(new Dirty(main)))
.addSubcommand("export", new CommandLine(new Export(main)))
.addSubcommand("explain", new CommandLine(new Explain(main)))
.addSubcommand("get", new CommandLine(new CamelStatus(main))
.addSubcommand("bean", new CommandLine(new CamelBeanDump(main)))
.addSubcommand("blocked", new CommandLine(new ListBlocked(main)))