
praisonai insists on using openai and cannot use the local model from ollama #394

@Themisstone

Description


I want to use a local model with PraisonAI.

Even after setting

export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL_NAME=deepseek-r1:14b
export OPENAI_API_KEY=NA

and confirming that Ollama is running, praisonai --init still insists on using the OpenAI API and shows the error below. Occasionally it works, but most of the time it fails with the same connection error. Can anyone help? Many thanks.
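
For reference, those variables are meant to point the OpenAI client at Ollama's OpenAI-compatible endpoint. Below is a minimal sketch of the equivalent direct configuration, assuming a default Ollama install on port 11434 and that deepseek-r1:14b has already been pulled; running it outside of PraisonAI at least confirms the endpoint itself responds.

```python
# Minimal sketch: call Ollama's OpenAI-compatible endpoint directly.
# Assumes Ollama is running on the default port and the model has been
# pulled with `ollama pull deepseek-r1:14b`.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # same value as OPENAI_BASE_URL
    api_key="NA",  # Ollama ignores the key, but the client requires a non-empty value
)

response = client.chat.completions.create(
    model="deepseek-r1:14b",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)
```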

[14:14:31] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.436051 seconds
[14:14:32] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.882234 seconds
[14:14:33] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.459925 seconds
[14:14:33] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.901005 seconds
[14:14:34] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.422548 seconds
[14:14:35] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.811394 seconds
[14:14:35] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.435414 seconds
[14:14:36] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.820558 seconds
[14:14:37] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.404516 seconds
[14:14:37] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.842089 seconds
[14:14:38] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.484937 seconds
[14:14:39] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.929251 seconds
[14:14:40] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.470558 seconds
[14:14:40] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.997868 seconds
[14:14:41] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.400135 seconds
[14:14:41] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.780188 seconds
[14:14:42] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.438576 seconds
[14:14:43] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.938123 seconds
[14:14:44] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.472849 seconds
[14:14:44] INFO _base_client.py:1099 Retrying request to /chat/completions in 0.916233 seconds
Traceback (most recent call last):
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpx/_transports/default.py", line 250, in handle_request
resp = self._pool.handle_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 256, in handle_request
raise exc from None
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpcore/_sync/connection_pool.py", line 236, in handle_request
response = connection.handle_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 101, in handle_request
raise exc
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 78, in handle_request
stream = self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpcore/_sync/connection.py", line 124, in _connect
stream = self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpcore/_backends/sync.py", line 207, in connect_tcp
with map_exceptions(exc_map):
File "/opt/anaconda3/envs/myenv/lib/python3.11/contextlib.py", line 158, in exit
self.gen.throw(typ, value, traceback)
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 8] nodename nor servname provided, or not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 1003, in _request
response = self._client.send(
^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpx/_client.py", line 914, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpx/_client.py", line 942, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpx/_client.py", line 1014, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpx/_transports/default.py", line 249, in handle_request
with map_httpcore_exceptions():
File "/opt/anaconda3/envs/myenv/lib/python3.11/contextlib.py", line 158, in exit
self.gen.throw(typ, value, traceback)
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 8] nodename nor servname provided, or not known

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/anaconda3/envs/myenv/bin/praisonai", line 8, in
sys.exit(main())
^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/praisonai/main.py", line 7, in main
praison_ai.main()
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/praisonai/cli.py", line 284, in main
self.agent_file = generator.generate()
^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/praisonai/auto.py", line 130, in generate
response = self.client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/instructor/patch.py", line 143, in new_create_sync
response = retry_sync(
^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/instructor/retry.py", line 151, in retry_sync
for attempt in max_retries:
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/tenacity/_init.py", line 443, in _iter
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/tenacity/init.py", line 376, in iter
result = action(retry_state)
^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/tenacity/init.py", line 418, in exc_check
raise retry_exc.reraise()
^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/tenacity/init.py", line 185, in reraise
raise self.last_attempt.result()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/instructor/retry.py", line 154, in retry_sync
response = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 279, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 879, in create
return self._post(
^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 1290, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 967, in request
return self._request(
^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 1027, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 1027, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 1105, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/myenv/lib/python3.11/site-packages/openai/_base_client.py", line 1037, in _request
raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
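
The final httpcore.ConnectError ("[Errno 8] nodename nor servname provided, or not known") is a hostname-resolution failure rather than a refused connection, so I suspect the process never sees http://localhost:11434/v1 and is still trying to reach the default OpenAI host. Below is a small diagnostic sketch, assuming the openai 1.x client (which reads OPENAI_BASE_URL itself), run in the same shell where praisonai is launched:

```python
# Diagnostic sketch: confirm which base URL the OpenAI client actually uses
# in the environment where praisonai runs.
import os
from openai import OpenAI

print("OPENAI_BASE_URL   =", os.environ.get("OPENAI_BASE_URL"))
print("OPENAI_MODEL_NAME =", os.environ.get("OPENAI_MODEL_NAME"))

client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY", "NA"))
print("client.base_url   =", client.base_url)  # expected: http://localhost:11434/v1
```

If client.base_url still shows https://api.openai.com/v1, the exports are not visible to the praisonai process (for example, set in a different shell or not exported).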
