
Docker Image missing dependencies #1519

@nicholasamiller

Description


Describe the issue as clearly as possible:

Running the official Docker image as documented fails at startup with `ModuleNotFoundError: No module named 'torch'`. The container's entrypoint crashes while importing `outlines`, because `outlines/models/exllamav2.py` does `import torch` and `torch` is not installed in the image. The full traceback is reproduced under "Error message" below.

Steps/code to reproduce the bug:

Follow the instructions in the documentation:


Alternative Method: Via Docker
You can install and run the server with Outlines' official Docker image using the command


docker run -p 8000:8000 outlinesdev/outlines --model="microsoft/Phi-3-mini-4k-instruct"
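
The failing import can also be checked directly inside the published image by overriding the entrypoint (a hypothetical check; it assumes `python3` is on the image's PATH, which the `/usr/local/lib/python3.10` paths in the traceback suggest):

```
# Run the Python interpreter inside the image instead of the server entrypoint
# and try importing torch; this should raise the same ModuleNotFoundError
# if torch is genuinely absent from the image.
docker run --rm --entrypoint python3 outlinesdev/outlines -c "import torch"
```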

Expected result:

Container runs without error.

Error message:

docker run -p 8000:8000 outlinesdev/outlines --model="microsoft/Phi-3-mini-4k-instruct"
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/runpy.py", line 187, in _run_module_as_main
    mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
  File "/usr/local/lib/python3.10/runpy.py", line 110, in _get_module_details
    __import__(pkg_name)
  File "/outlines/outlines/__init__.py", line 3, in <module>
    import outlines.generate
  File "/outlines/outlines/generate/__init__.py", line 2, in <module>
    from .cfg import cfg
  File "/outlines/outlines/generate/cfg.py", line 7, in <module>
    from outlines.models import LlamaCpp, OpenAI, TransformersVision
  File "/outlines/outlines/models/__init__.py", line 11, in <module>
    from .exllamav2 import ExLlamaV2Model, exl2
  File "/outlines/outlines/models/exllamav2.py", line 4, in <module>
    import torch
ModuleNotFoundError: No module named 'torch'
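
A possible workaround, sketched below under the assumption that `pip` is available in the image and that `torch` is the only missing package (other optional dependencies, e.g. `exllamav2`, may be missing as well), is to build a derived image that installs it:

```
# Untested sketch: extend the published image with the missing dependency.
docker build -t outlines-with-torch - <<'EOF'
FROM outlinesdev/outlines
RUN pip install torch
EOF

# Then run the derived image the same way as in the docs.
docker run -p 8000:8000 outlines-with-torch --model="microsoft/Phi-3-mini-4k-instruct"
```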

Outlines/Python version information:

Version information

```
(command output here)
```

Context for the issue:

No response
