Fix docker failing to start llama-stack container [fireworks-ai, inference] #3273

@slekkala1

Description

System Info

This is not a local environment issue.
The failure can be seen in GitHub Actions: https://github.com/llamastack/llama-stack-ops/actions/runs/17253649880

Information

  • The official example scripts
  • My own modified scripts

🐛 Describe the bug

https://github.com/llamastack/llama-stack-ops/actions/runs/17253649880

Fix the incompatibility with the openai package introduced through the new dependency chain fireworks-ai==0.19.18 -> reward-kit, by pinning fireworks-ai to an older version that doesn't pull in reward-kit.
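The pin boundary can be checked mechanically. A minimal sketch, assuming (per this issue) that 0.19.18 is the first fireworks-ai release that pulls in reward-kit; the naive parser below handles only dotted numeric versions, not pre-release tags:

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Parse a simple dotted version string (pre-release tags not handled)."""
    return tuple(int(part) for part in v.split("."))

# fireworks-ai==0.19.18 is the first release known (from this issue)
# to depend on reward-kit, which triggers the openai ImportError.
FIRST_BAD = parse_version("0.19.18")

def pulls_in_reward_kit(version: str) -> bool:
    """True if this fireworks-ai version is known to pull in reward-kit."""
    return parse_version(version) >= FIRST_BAD

print(pulls_in_reward_kit("0.19.17"))  # False: safe to pin at or below this
print(pulls_in_reward_kit("0.19.18"))  # True: pulls in reward-kit
```

In a real build you would express the same constraint declaratively, e.g. `fireworks-ai<0.19.18` in the dependency specification, rather than checking at runtime.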

Error logs

```
    impl = await instantiate_provider(provider, deps, inner_impls, dist_registry, run_config, policy)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/llama-stack-source/llama_stack/core/resolver.py", line 383, in instantiate_provider
    impl = await fn(*args)
           ^^^^^^^^^^^^^^^
  File "/app/llama-stack-source/llama_stack/providers/remote/inference/fireworks/__init__.py", line 17, in get_adapter_impl
    from .fireworks import FireworksInferenceAdapter
  File "/app/llama-stack-source/llama_stack/providers/remote/inference/fireworks/fireworks.py", line 10, in <module>
    from fireworks.client import Fireworks
  File "/usr/local/lib/python3.12/site-packages/fireworks/__init__.py", line 7, in <module>
    from reward_kit import reward_function
  File "/usr/local/lib/python3.12/site-packages/reward_kit/__init__.py", line 13, in <module>
    from .adapters.braintrust import reward_fn_to_scorer, scorer_to_reward_fn
  File "/usr/local/lib/python3.12/site-packages/reward_kit/adapters/braintrust.py", line 6, in <module>
    from ..integrations.braintrust import reward_fn_to_scorer, scorer_to_reward_fn
  File "/usr/local/lib/python3.12/site-packages/reward_kit/integrations/__init__.py", line 3, in <module>
    from .braintrust import reward_fn_to_scorer, scorer_to_reward_fn
  File "/usr/local/lib/python3.12/site-packages/reward_kit/integrations/braintrust.py", line 5, in <module>
    from reward_kit.models import EvaluateResult, Message
  File "/usr/local/lib/python3.12/site-packages/reward_kit/models.py", line 4, in <module>
    from openai.types.chat.chat_completion_message import (
ImportError: cannot import name 'ChatCompletionMessageToolCall' from 'openai.types.chat.chat_completion_message' (/usr/local/lib/python3.12/site-packages/openai/types/chat/chat_completion_message.py). Did you mean: 'ChatCompletionMessageToolCallUnion'?
Exception ignored in: <generator object inference_recording at 0x7f1e1162e2f0>
```
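The root cause is that newer openai releases renamed `ChatCompletionMessageToolCall` (the error message itself suggests `ChatCompletionMessageToolCallUnion`), while reward_kit still imports the old name. The fix in this issue is the version pin; a longer-term fix belongs upstream in reward_kit, where the usual compatibility pattern is a guarded import. A minimal sketch of that pattern, demonstrated against a hypothetical stub module standing in for the new openai layout (not the actual openai package):

```python
import sys
import types

# Stub standing in for openai.types.chat.chat_completion_message as it looks
# in newer openai releases: only the renamed *Union symbol exists.
stub = types.ModuleType("chat_completion_message_stub")

class ChatCompletionMessageToolCallUnion:
    """Placeholder for the renamed openai type (hypothetical stand-in)."""

stub.ChatCompletionMessageToolCallUnion = ChatCompletionMessageToolCallUnion
sys.modules["chat_completion_message_stub"] = stub

# Compatibility import: try the old name first, fall back to the new one.
try:
    from chat_completion_message_stub import ChatCompletionMessageToolCall
except ImportError:
    from chat_completion_message_stub import (
        ChatCompletionMessageToolCallUnion as ChatCompletionMessageToolCall,
    )

print(ChatCompletionMessageToolCall.__name__)
```

With this guard in place, the importing module works against both old and new openai releases, at the cost of papering over the rename rather than tracking it.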

Expected behavior

No failure when starting the container, as in this earlier successful run: https://github.com/llamastack/llama-stack-ops/actions/runs/17224110550


Labels: bug (Something isn't working)
