Description
Short error message:
File "/Users/rameshreddy/NLWeb/code/tools/db_load.py", line 724, in loadJsonToDB
endpoint_name = database or CONFIG.preferred_retrieval_endpoint
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'AppConfig' object has no attribute 'preferred_retrieval_endpoint'
Detailed log message: full error output from the load process.
(myenv) rameshreddy@Mac code % python3 -m tools.db_load https://feeds.libsyn.com/121695/rss Behind-the-Tech
Fetching content from URL: https://feeds.libsyn.com/121695/rss
Fetching content from URL: https://feeds.libsyn.com/121695/rss
Saved URL content to temporary file: /var/folders/_g/xwl6wrh96vl2ksvwpzw9cml40000gp/T/tmpc6grv434.xml (type: rss)
Detected file type: rss, contains embeddings: No
Computing embeddings for file...
Cleaned up temporary file: /var/folders/_g/xwl6wrh96vl2ksvwpzw9cml40000gp/T/tmpc6grv434.xml
Traceback (most recent call last):
File "", line 198, in _run_module_as_main
File "", line 88, in _run_code
File "/Users/rameshreddy/NLWeb/code/tools/db_load.py", line 1170, in
asyncio.run(main())
~~~~~~~~~~~^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/runners.py", line 194, in run
return runner.run(main)
~~~~~~~~~~^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/base_events.py", line 721, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File "/Users/rameshreddy/NLWeb/code/tools/db_load.py", line 1156, in main
await loadJsonToDB(file_path, args.site, args.batch_size, args.delete_site, args.force_recompute, args.database)
File "/Users/rameshreddy/NLWeb/code/tools/db_load.py", line 724, in loadJsonToDB
endpoint_name = database or CONFIG.preferred_retrieval_endpoint
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'AppConfig' object has no attribute 'preferred_retrieval_endpoint'
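Until the fix lands, the crash can be worked around locally by not assuming the attribute exists on the config object. This is a minimal sketch, not NLWeb's actual API: the fallback attribute name write_endpoint is an assumption about what AppConfig might expose instead.

```python
# Hypothetical workaround sketch for loadJsonToDB -- the fallback attribute name
# below is an assumption, not NLWeb's actual config API; the real fix landed upstream.
def resolve_endpoint_name(database, config):
    """Use an explicitly passed database name if given, otherwise fall back to
    whichever endpoint attribute the loaded config object actually exposes."""
    if database:
        return database
    for attr in ("preferred_retrieval_endpoint", "write_endpoint"):  # second name is assumed
        value = getattr(config, attr, None)
        if value:
            return value
    raise ValueError("No retrieval endpoint configured; pass a database name explicitly.")
```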
Activity
ViktorS898 commented on Jun 16, 2025
Encountered the same issue
jennifermarsman commented on Jun 16, 2025
Acknowledged, thanks for raising. I will try to repro now.
jennifermarsman commented on Jun 16, 2025
Yes, it looks like a bug got merged. We will fix; thank you!
jennifermarsman commented on Jun 16, 2025
The breaking change has been temporarily backed out: #213
rameshcreddy commented on Jun 16, 2025
Thanks for the quick turnaround. I tried again, but I am seeing a different error during the data load process; please advise. I modified config_embedding.yaml, config_llm.yaml, and config_retrieval.yaml and provided the OpenAI key in the .env file per the instructions.
Brief error messages:
Error processing batch: Connection error.
2025-06-16 09:59:27,398 - azure_oai_embedding - ERROR - exception:167 - Error generating Azure OpenAI batch embeddings
NoneType: None
2025-06-16 09:59:27,402 - azure_oai_embedding - ERROR - log_with_context:181 - Azure OpenAI batch embedding generation failed | Context: model=text-embedding-3-small - batch_size=74 - error_type=APIConnectionError - error_message=Connection error.
2025-06-16 09:59:27,403 - embedding_wrapper - ERROR - exception:167 - Error during batch embedding generation with provider azure_openai
NoneType: None
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/openai/_base_client.py", line 1748, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/openai/_base_client.py", line 1522, in request
raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
Processed 74/74 documents
Loading completed. Added 0 documents to the database.
Detailed error log:
(myenv) rameshreddy@Mac code % python -m tools.db_load https://feeds.libsyn.com/121695/rss Behind-the-Tech
Fetching content from URL: https://feeds.libsyn.com/121695/rss
Fetching content from URL: https://feeds.libsyn.com/121695/rss
Saved URL content to temporary file: /var/folders/_g/xwl6wrh96vl2ksvwpzw9cml40000gp/T/tmp29w4iusa.xml (type: rss)
Detected file type: rss, contains embeddings: No
Computing embeddings for file...
Loading data from /var/folders/_g/xwl6wrh96vl2ksvwpzw9cml40000gp/T/tmp29w4iusa.xml (resolved to /var/folders/_g/xwl6wrh96vl2ksvwpzw9cml40000gp/T/tmp29w4iusa.xml) for site Behind-the-Tech using database endpoint 'nlweb_west'
Detected file type: rss
Using embedding provider: azure_openai, model: text-embedding-3-small
Processing as RSS feed...
Processing RSS/Atom feed: /var/folders/_g/xwl6wrh96vl2ksvwpzw9cml40000gp/T/tmp29w4iusa.xml
Processed 74 episodes from RSS/Atom feed
Computing embeddings for batch of 74 texts
Error processing batch: Connection error.
2025-06-16 09:59:27,398 - azure_oai_embedding - ERROR - exception:167 - Error generating Azure OpenAI batch embeddings
NoneType: None
2025-06-16 09:59:27,402 - azure_oai_embedding - ERROR - log_with_context:181 - Azure OpenAI batch embedding generation failed | Context: model=text-embedding-3-small - batch_size=74 - error_type=APIConnectionError - error_message=Connection error.
2025-06-16 09:59:27,403 - embedding_wrapper - ERROR - exception:167 - Error during batch embedding generation with provider azure_openai
NoneType: None
2025-06-16 09:59:27,406 - embedding_wrapper - ERROR - log_with_context:181 - Batch embedding generation failed | Context: provider=azure_openai - model=text-embedding-3-small - batch_size=74 - error_type=APIConnectionError - error_message=Connection error.
Traceback (most recent call last):
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpx/_transports/default.py", line 101, in map_httpcore_exceptions
yield
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpx/_transports/default.py", line 394, in handle_async_request
resp = await self._pool.handle_async_request(req)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
raise exc from None
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
response = await connection.handle_async_request(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pool_request.request
^^^^^^^^^^^^^^^^^^^^
)
^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
raise exc
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
stream = await self._connect(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpcore/_async/connection.py", line 124, in _connect
stream = await self._network_backend.connect_tcp(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
return await self._backend.connect_tcp(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<5 lines>...
)
^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
with map_exceptions(exc_map):
~~~~~~~~~~~~~~^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 162, in exit
self.gen.throw(value)
~~~~~~~~~~~~~~^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 8] nodename nor servname provided, or not known
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/openai/_base_client.py", line 1490, in request
response = await self._client.send(
^^^^^^^^^^^^^^^^^^^^^^^^
...<3 lines>...
)
^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpx/_client.py", line 1629, in send
response = await self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<4 lines>...
)
^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpx/_client.py", line 1657, in _send_handling_auth
response = await self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<3 lines>...
)
^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpx/_client.py", line 1694, in _send_handling_redirects
response = await self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpx/_client.py", line 1730, in _send_single_request
response = await transport.handle_async_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpx/_transports/default.py", line 393, in handle_async_request
with map_httpcore_exceptions():
~~~~~~~~~~~~~~~~~~~~~~~^^
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/contextlib.py", line 162, in exit
self.gen.throw(value)
~~~~~~~~~~~~~~^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/httpx/_transports/default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.ConnectError: [Errno 8] nodename nor servname provided, or not known
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/rameshreddy/NLWeb/code/tools/db_load.py", line 853, in loadJsonToDB
embeddings = await batch_get_embeddings(batch_texts, provider, model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rameshreddy/NLWeb/code/embedding/embedding.py", line 192, in batch_get_embeddings
result = await asyncio.wait_for(
^^^^^^^^^^^^^^^^^^^^^^^
...<2 lines>...
)
^
File "/Library/Frameworks/Python.framework/Versions/3.13/lib/python3.13/asyncio/tasks.py", line 507, in wait_for
return await fut
^^^^^^^^^
File "/Users/rameshreddy/NLWeb/code/embedding/azure_oai_embedding.py", line 167, in get_azure_batch_embeddings
response = await client.embeddings.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<2 lines>...
)
^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/openai/resources/embeddings.py", line 245, in create
return await self._post(
^^^^^^^^^^^^^^^^^
...<10 lines>...
)
^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/openai/_base_client.py", line 1748, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/rameshreddy/NLWeb/myenv/lib/python3.13/site-packages/openai/_base_client.py", line 1522, in request
raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
Processed 74/74 documents
Loading completed. Added 0 documents to the database.
Saved file with embeddings to ../data/json_with_embeddings/tmp29w4iusa.xml
Cleaned up temporary file: /var/folders/_g/xwl6wrh96vl2ksvwpzw9cml40000gp/T/tmp29w4iusa.xml
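The root cause in the chained traceback above is httpcore.ConnectError: [Errno 8] nodename nor servname provided, or not known, which is a DNS-resolution failure: the client is being pointed at a hostname that does not exist (for example, a placeholder Azure OpenAI endpoint left in the config while the key provided belongs to OpenAI). A standalone sketch to confirm whether the configured host resolves at all; the AZURE_OPENAI_ENDPOINT variable name is only an assumption about where the endpoint URL is kept.

```python
# Quick DNS check, independent of NLWeb. The AZURE_OPENAI_ENDPOINT variable name
# is an assumption; substitute the endpoint URL from your own config.
import os
import socket
from urllib.parse import urlparse

endpoint = os.environ.get("AZURE_OPENAI_ENDPOINT", "https://your-resource.openai.azure.com")
host = urlparse(endpoint).hostname or endpoint

try:
    socket.getaddrinfo(host, 443)
    print(f"{host} resolves; the failure is more likely auth or network than DNS.")
except socket.gaierror as exc:
    # Errno 8 on macOS is "nodename nor servname provided, or not known"
    print(f"{host} does not resolve ({exc}); check the endpoint URL in your config/.env.")
```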
jennifermarsman commented on Jun 16, 2025
@rameshcreddy it looks like your LLM configuration is pointing to Azure OpenAI, and if you set the key for an OpenAI endpoint, it won't work. Try changing the preferred_endpoint at the top of your config_llm.yaml file from azure_openai to openai.
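For anyone double-checking that change, here is a minimal sketch that compares preferred_endpoint in config_llm.yaml with the keys present in .env. The config path and the environment variable names (OPENAI_API_KEY, AZURE_OPENAI_API_KEY) are assumptions about the local setup, not confirmed NLWeb conventions.

```python
# Sanity-check sketch: does preferred_endpoint in config_llm.yaml match the key in .env?
# The config path and the environment variable names are assumptions about the setup.
import os
import yaml                      # pip install pyyaml
from dotenv import load_dotenv   # pip install python-dotenv

load_dotenv()  # load keys from .env into the environment

with open("config/config_llm.yaml") as f:
    preferred = yaml.safe_load(f).get("preferred_endpoint")

print(f"preferred_endpoint = {preferred!r}")
if preferred == "azure_openai" and not os.environ.get("AZURE_OPENAI_API_KEY"):
    print("Config expects Azure OpenAI but no Azure key is set; change preferred_endpoint to openai.")
elif preferred == "openai" and not os.environ.get("OPENAI_API_KEY"):
    print("Config expects OpenAI but OPENAI_API_KEY is not set in .env.")
else:
    print("Provider and key look consistent.")
```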
jennifermarsman commented on Jun 16, 2025
@rameshcreddy we just merged the check-connectivity script PR (PR #203) so if you git pull, you can get that. It's a simple script that you can run to ensure that your setup is working. Docs are at https://github.com/microsoft/NLWeb/blob/main/docs/nlweb-check-connectivity.md.
rameshcreddy commented on Jun 16, 2025
Thanks @jennifermarsman for all your support. You rock... NLWeb is up and running!! Have a great day.
jennifermarsman commented on Jun 24, 2025
The fix for this underlying issue was made in commit 4d5db7a and merged in PR #220.