
Fails to start successfully, service unusable #1354

@yuzhi-jiang

Description


Background info: Ollama is deployed locally, the relevant models are already present in the configuration, the ports referenced in docker-compose.yml are not occupied at startup, WrenAI version: 0.15.3.

Issue 1: when the PostgreSQL password contains the `#` character, it is not recognized.
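A likely cause for Issue 1: `#` starts a URL fragment in connection URIs (and often a comment in .env-style files), so a password containing it usually needs to be percent-encoded as `%23` before it is placed in the connection string. A minimal sketch, using a hypothetical password purely for illustration:

```python
from urllib.parse import quote_plus

# Hypothetical credentials, for illustration only.
password = "p@ss#word"

# '#' starts a URL fragment and '@' terminates the userinfo part of a
# URI, so both must be percent-encoded before use in a connection string.
encoded = quote_plus(password)  # -> "p%40ss%23word"

dsn = f"postgresql://wren:{encoded}@localhost:5432/mydb"
print(dsn)  # postgresql://wren:p%40ss%23word@localhost:5432/mydb
```

If the password is entered through the WrenAI UI rather than a URI, this may instead be a UI-side parsing bug, but encoding is worth trying first.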


Issue 2: the service cannot be used.


Here is the response:

{
    "errors": [
        {
            "locations": [
                {
                    "line": 2,
                    "column": 3
                }
            ],
            "path": [
                "createAskingTask"
            ],
            "message": "Cannot read properties of null (reading 'hash')",
            "extensions": {
                "code": "INTERNAL_SERVER_ERROR",
                "message": "Cannot read properties of null (reading 'hash')",
                "shortMessage": "Internal server error"
            }
        }
    ],
    "data": null
}

Below is config.yml:

type: llm
provider: litellm_llm
models:
- api_base: http://127.0.0.1:11434/v1  # change this to your ollama host, api_base should be <ollama_url>/v1
  api_key_name: LLM_OLLAMA_API_KEY
  model: openai/deepseek-r1:14b  # openai/<ollama_model_name>
  timeout: 600

---
type: embedder
provider: litellm_embedder
models:
# - model: text-embedding-3-large
#   api_base: https://api.openai.com/v1
#   api_key_name: EMBEDDER_OPENAI_API_KEY
#   timeout: 120
- model: openai/bge-m3:latest  # put your ollama embedder model name here, openai/<ollama_model_name>
  api_base: http://127.0.0.1:11434/v1  # change this to your ollama host, api_base should be <ollama_url>/v1
  api_key_name: EMBEDDER_OLLAMA_API_KEY
  timeout: 600
---
type: engine
provider: wren_ui
endpoint: http://wren-ui:3000

---
type: document_store
provider: qdrant
location: http://qdrant:6333
embedding_model_dim: 3072
timeout: 120
recreate_index: true

---
type: pipeline
pipes:
  - name: db_schema_indexing
    embedder: litellm_embedder.openai/bge-m3:latest
    document_store: qdrant
  - name: historical_question_indexing
    embedder: litellm_embedder.openai/bge-m3:latest
    document_store: qdrant
  - name: table_description_indexing
    embedder: litellm_embedder.openai/bge-m3:latest
    document_store: qdrant
  - name: db_schema_retrieval
    llm: litellm_llm.openai/deepseek-r1:14b
    embedder: litellm_embedder.openai/bge-m3:latest
    document_store: qdrant
  - name: historical_question_retrieval
    embedder: litellm_embedder.openai/bge-m3:latest
    document_store: qdrant
  - name: sql_generation
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui
  - name: sql_correction
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui
  - name: followup_sql_generation
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui
  - name: sql_summary
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: sql_answer
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui
  - name: sql_breakdown
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui
  - name: sql_expansion
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui
  - name: sql_explanation
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: semantics_description
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: relationship_recommendation
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui
  - name: question_recommendation
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: question_recommendation_db_schema_retrieval
    llm: litellm_llm.openai/deepseek-r1:14b
    embedder: litellm_embedder.openai/bge-m3:latest
    document_store: qdrant
  - name: question_recommendation_sql_generation
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui
  - name: chart_generation
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: chart_adjustment
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: intent_classification
    llm: litellm_llm.openai/deepseek-r1:14b
    embedder: litellm_embedder.openai/bge-m3:latest
    document_store: qdrant
  - name: data_assistance
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: sql_pairs_indexing
    document_store: qdrant
    embedder: litellm_embedder.openai/bge-m3:latest
  - name: sql_pairs_deletion
    document_store: qdrant
    embedder: litellm_embedder.openai/bge-m3:latest
  - name: sql_pairs_retrieval
    document_store: qdrant
    embedder: litellm_embedder.openai/bge-m3:latest
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: preprocess_sql_data
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: sql_executor
    engine: wren_ui
  - name: sql_question_generation
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: sql_generation_reasoning
    llm: litellm_llm.openai/deepseek-r1:14b
  - name: sql_regeneration
    llm: litellm_llm.openai/deepseek-r1:14b
    engine: wren_ui

---
settings:
  column_indexing_batch_size: 50
  table_retrieval_size: 10
  table_column_retrieval_size: 100
  allow_using_db_schemas_without_pruning: false
  query_cache_maxsize: 1000
  query_cache_ttl: 3600
  langfuse_host: https://cloud.langfuse.com
  langfuse_enable: true
  logging_level: DEBUG
  development: false
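One thing worth double-checking in the config above, separate from the startup failure: `embedding_model_dim` is set to 3072, which matches OpenAI's text-embedding-3-large, but bge-m3 produces 1024-dimensional embeddings. If the Qdrant collection dimension does not match the embedder output, indexing and retrieval will fail. A hedged sketch of the adjusted block:

```yaml
type: document_store
provider: qdrant
location: http://qdrant:6333
embedding_model_dim: 1024  # bge-m3 outputs 1024-dimensional vectors
timeout: 120
recreate_index: true
```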

.env configuration:

COMPOSE_PROJECT_NAME=wrenai
PLATFORM=linux/amd64

PROJECT_DIR=.

# service port
WREN_ENGINE_PORT=8080
WREN_ENGINE_SQL_PORT=7432
WREN_AI_SERVICE_PORT=5555
WREN_UI_PORT=3000
IBIS_SERVER_PORT=8000
WREN_UI_ENDPOINT=http://wren-ui:${WREN_UI_PORT}

# ai service settings
QDRANT_HOST=qdrant
SHOULD_FORCE_DEPLOY=1

# vendor keys
LLM_OPENAI_API_KEY=
EMBEDDER_OPENAI_API_KEY=
LLM_AZURE_OPENAI_API_KEY=
EMBEDDER_AZURE_OPENAI_API_KEY=
QDRANT_API_KEY=
LLM_OLLAMA_API_KEY=xxx
EMBEDDER_OLLAMA_API_KEY=xxx

# version
# CHANGE THIS TO THE LATEST VERSION
WREN_PRODUCT_VERSION=0.15.3
WREN_ENGINE_VERSION=0.13.1
WREN_AI_SERVICE_VERSION=0.15.7
IBIS_SERVER_VERSION=0.13.1
WREN_UI_VERSION=0.20.1
WREN_BOOTSTRAP_VERSION=0.1.5

# user id (uuid v4)
USER_UUID=

# for other services
POSTHOG_API_KEY=phc_nhF32aj4xHXOZb0oqr2cn4Oy9uiWzz6CCP4KZmRq9aE
POSTHOG_HOST=https://app.posthog.com
TELEMETRY_ENABLED=true
# this is for telemetry to know the model; the ai-service might be able to provide an endpoint to get this information
GENERATION_MODEL=gpt-4o-mini
LANGFUSE_SECRET_KEY=
LANGFUSE_PUBLIC_KEY=

# the port exposes to the host
# OPTIONAL: change the port if you have a conflict
HOST_PORT=3000
AI_SERVICE_FORWARD_PORT=5555

# Wren UI
EXPERIMENTAL_ENGINE_RUST_VERSION=false

Below are some of the container logs:


wrenai-wren-ai-service-1

2025-03-04 14:24:02 INFO:     Started server process [9]
2025-03-04 14:24:02 INFO:     Waiting for application startup.
2025-03-04 14:24:02 I0304 06:24:02.642 9 wren-ai-service:40] Imported Provider: src.providers.document_store
2025-03-04 14:24:02 I0304 06:24:02.646 9 wren-ai-service:64] Registering provider: qdrant
2025-03-04 14:24:02 I0304 06:24:02.646 9 wren-ai-service:40] Imported Provider: src.providers.document_store.qdrant
2025-03-04 14:24:02 I0304 06:24:02.647 9 wren-ai-service:40] Imported Provider: src.providers.embedder
2025-03-04 14:24:03 I0304 06:24:03.547 9 wren-ai-service:64] Registering provider: azure_openai_embedder
2025-03-04 14:24:03 I0304 06:24:03.547 9 wren-ai-service:40] Imported Provider: src.providers.embedder.azure_openai
2025-03-04 14:24:04 /app/.venv/lib/python3.12/site-packages/pydantic/_internal/_config.py:345: UserWarning: Valid config keys have changed in V2:
2025-03-04 14:24:04 * 'fields' has been removed
2025-03-04 14:24:04   warnings.warn(message, UserWarning)
2025-03-04 14:24:05 I0304 06:24:05.271 9 wren-ai-service:64] Registering provider: litellm_embedder
2025-03-04 14:24:05 I0304 06:24:05.271 9 wren-ai-service:40] Imported Provider: src.providers.embedder.litellm
2025-03-04 14:24:05 I0304 06:24:05.276 9 wren-ai-service:64] Registering provider: ollama_embedder
2025-03-04 14:24:05 I0304 06:24:05.276 9 wren-ai-service:40] Imported Provider: src.providers.embedder.ollama
2025-03-04 14:24:05 I0304 06:24:05.287 9 wren-ai-service:64] Registering provider: openai_embedder
2025-03-04 14:24:05 I0304 06:24:05.288 9 wren-ai-service:40] Imported Provider: src.providers.embedder.openai
2025-03-04 14:24:05 I0304 06:24:05.289 9 wren-ai-service:40] Imported Provider: src.providers.engine
2025-03-04 14:24:05 I0304 06:24:05.291 9 wren-ai-service:64] Registering provider: wren_ui
2025-03-04 14:24:05 I0304 06:24:05.292 9 wren-ai-service:64] Registering provider: wren_ibis
2025-03-04 14:24:05 I0304 06:24:05.293 9 wren-ai-service:64] Registering provider: wren_engine
2025-03-04 14:24:05 I0304 06:24:05.293 9 wren-ai-service:40] Imported Provider: src.providers.engine.wren
2025-03-04 14:24:05 I0304 06:24:05.295 9 wren-ai-service:40] Imported Provider: src.providers.llm
2025-03-04 14:24:05 I0304 06:24:05.318 9 wren-ai-service:64] Registering provider: azure_openai_llm
2025-03-04 14:24:05 I0304 06:24:05.318 9 wren-ai-service:40] Imported Provider: src.providers.llm.azure_openai
2025-03-04 14:24:05 I0304 06:24:05.319 9 wren-ai-service:64] Registering provider: litellm_llm
2025-03-04 14:24:05 I0304 06:24:05.321 9 wren-ai-service:40] Imported Provider: src.providers.llm.litellm
2025-03-04 14:24:05 I0304 06:24:05.332 9 wren-ai-service:64] Registering provider: ollama_llm
2025-03-04 14:24:05 I0304 06:24:05.332 9 wren-ai-service:40] Imported Provider: src.providers.llm.ollama
2025-03-04 14:24:05 I0304 06:24:05.464 9 wren-ai-service:64] Registering provider: openai_llm
2025-03-04 14:24:05 I0304 06:24:05.464 9 wren-ai-service:40] Imported Provider: src.providers.llm.openai
2025-03-04 14:24:05 I0304 06:24:05.465 9 wren-ai-service:40] Imported Provider: src.providers.loader
2025-03-04 14:24:05 ERROR:    Traceback (most recent call last):
2025-03-04 14:24:05   File "/app/.venv/lib/python3.12/site-packages/starlette/routing.py", line 693, in lifespan
2025-03-04 14:24:05     async with self.lifespan_context(app) as maybe_state:
2025-03-04 14:24:05   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2025-03-04 14:24:05     return await anext(self.gen)
2025-03-04 14:24:05            ^^^^^^^^^^^^^^^^^^^^^
2025-03-04 14:24:05   File "/app/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 133, in merged_lifespan
2025-03-04 14:24:05     async with original_context(app) as maybe_original_state:
2025-03-04 14:24:05   File "/usr/local/lib/python3.12/contextlib.py", line 204, in __aenter__
2025-03-04 14:24:05     return await anext(self.gen)
2025-03-04 14:24:05            ^^^^^^^^^^^^^^^^^^^^^
2025-03-04 14:24:05   File "/src/__main__.py", line 31, in lifespan
2025-03-04 14:24:05     pipe_components = generate_components(settings.components)
2025-03-04 14:24:05                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-04 14:24:05   File "/src/providers/__init__.py", line 332, in generate_components
2025-03-04 14:24:05     config = transform(configs)
2025-03-04 14:24:05              ^^^^^^^^^^^^^^^^^^
2025-03-04 14:24:05   File "/src/providers/__init__.py", line 294, in transform
2025-03-04 14:24:05     converted = processor(entry)
2025-03-04 14:24:05                 ^^^^^^^^^^^^^^^^
2025-03-04 14:24:05   File "/src/providers/__init__.py", line 78, in llm_processor
2025-03-04 14:24:05     "kwargs": model["kwargs"],
2025-03-04 14:24:05               ~~~~~^^^^^^^^^^
2025-03-04 14:24:05 KeyError: 'kwargs'
2025-03-04 14:24:05 
2025-03-04 14:24:05 ERROR:    Application startup failed. Exiting.
(The identical traceback, ending in the same KeyError: 'kwargs', repeats on every container restart.)
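The `KeyError: 'kwargs'` comes from `llm_processor` reading `model["kwargs"]` unconditionally, so this version of wren-ai-service appears to require a `kwargs` key on every LLM model entry in config.yml. A hedged sketch of the first config block with that key added (the `temperature` value is illustrative, not from the original config):

```yaml
type: llm
provider: litellm_llm
models:
- api_base: http://127.0.0.1:11434/v1  # <ollama_url>/v1
  api_key_name: LLM_OLLAMA_API_KEY
  model: openai/deepseek-r1:14b
  timeout: 600
  kwargs:           # required by llm_processor in this version
    temperature: 0  # example value; adjust as needed
```

Since the ai-service never finishes startup, the UI's `createAskingTask` call has no backend to talk to, which plausibly explains the "Cannot read properties of null (reading 'hash')" error in Issue 2.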


Metadata

Assignees: no one assigned
Labels: bug (Something isn't working)
Type: none
Projects: none
Milestone: none