run inference-perf --config_file config.yml failed #127

Open
Description

@liyuerich

1. Updated config.yml with my currently used model_name and base_URL (both point to my local cloud).

2. Ran the command per readme.md:
inference-perf --config_file config.yml --log-level DEBUG

OSError: We couldn't connect to 'https://huggingface.co' to load the files, and couldn't find them in the cached files.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/conda/envs/py312/bin/inference-perf", line 8, in
sys.exit(main_cli())
^^^^^^^^^^
File "/opt/conda/envs/py312/lib/python3.12/site-packages/inference_perf/main.py", line 122, in main_cli
raise Exception("Tokenizer initialization failed") from e

Not sure why it tries to connect to "https://huggingface.co"?
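For context: the tokenizer is typically resolved by model name through the Hugging Face Hub, which is why a network request happens at all. A possible workaround sketch, not verified against inference-perf internals: `HF_HUB_OFFLINE` and `TRANSFORMERS_OFFLINE` are standard huggingface_hub/transformers environment variables (not inference-perf options), and they only help if the tokenizer files are already in the local cache.

```shell
# Standard offline switches for huggingface_hub / transformers; with these set,
# the tokenizer resolves only from the local cache instead of huggingface.co:
export HF_HUB_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
# then re-run:
#   inference-perf --config_file config.yml --log-level DEBUG
```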

Metadata

Assignees

No one assigned

Labels

kind/cleanup: Categorizes issue or PR as related to cleaning up code, process, or technical debt.
kind/support: Categorizes issue or PR as a support question.
