ollama + litellm #24

Closed
louis030195 opened this issue Dec 24, 2023 · 2 comments
Labels
good first issue · help wanted

Comments

@louis030195 (Collaborator)

Ollama does not support the OpenAI API (why not?)

The workaround is to put the LiteLLM proxy in front of it:

https://docs.litellm.ai/docs/proxy/quick_start
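
For reference, a minimal sketch of that quick start in front of a local Ollama model (the model name, port, and route follow the quick-start defaults and may differ by litellm version; check the proxy's startup log):

# install litellm with the proxy extra
pip install 'litellm[proxy]'

# start an OpenAI-compatible proxy that forwards to the local ollama server
litellm --model ollama/phi

# then point any OpenAI client at the proxy, e.g.:
curl http://0.0.0.0:4000/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{"model": "ollama/phi", "messages": [{"role": "user", "content": "hello"}]}'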

@louis030195 (Collaborator, Author)

I still don't understand why people use Ollama, which seems suboptimal compared to vLLM, but it keeps collecting stars anyway.

@louis030195 (Collaborator, Author)

Leaving a working docker-compose file here in case someone wants Ollama:

version: '3.8'
services:
  postgres:
    container_name: pg
    image: postgres
    restart: always
    environment:
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: mydatabase
    ports:
      - 5432:5432
    command: postgres -c 'max_connections=250'
    volumes:
      # apply the repo's schema on first initialization
      - ../assistants-core/src/migrations.sql:/docker-entrypoint-initdb.d/migrations.sql
      - ./pg-healthcheck.sh:/pg-healthcheck.sh
    healthcheck:
      test: ["CMD-SHELL", "/pg-healthcheck.sh"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 20s
  redis:
    container_name: redis
    image: redis
    restart: always
    ports:
      - 6379:6379

  minio:
    container_name: minio1
    image: minio/minio
    restart: always
    ports:
      - 9000:9000
      - 9001:9001
    environment:
      MINIO_ROOT_USER: minioadmin
      MINIO_ROOT_PASSWORD: minioadmin
    command: server /data --console-address ":9001"

  ollama:
    image: ollama/ollama
    container_name: ollama
    ports:
      - 11434:11434
    profiles:
      - ollama
    volumes:
      - $HOME/.ollama/models:/usr/share/ollama/.ollama/models

  # waits until the ollama container responds, then launches the configured model inside it
  ollama-runner:
    container_name: ollama-runner
    image: docker
    env_file:
      - ../.env
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    command: |
      sh -c "
      while ! docker exec ollama true 2>/dev/null; do
        echo 'Waiting for ollama...'
        sleep 1
      done
      echo 'Ollama is ready, running command...'
      docker exec ollama ollama run ${OLLAMA_MODEL:-phi}
      "
    depends_on:
      - ollama
    profiles:
      - ollama
  
  assistants:
    container_name: assistants
    image: ghcr.io/stellar-amenities/assistants/assistants:latest
    build:
      context: ..
      dockerfile: docker/Dockerfile
    ports:
      - 3000:3000
    depends_on:
      - postgres
      - redis
      - minio
    profiles:
      - api
    environment:
      - DATABASE_URL=postgres://postgres:secret@postgres/mydatabase
      - S3_ENDPOINT=http://minio:9000
      - REDIS_URL=redis://redis
    env_file:
      - ../.env
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/health"]
      interval: 30s
      timeout: 10s
      retries: 3
      start_period: 20s
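
To use it (a sketch; the invocation assumes the file lives under docker/ next to the Dockerfile, per the build context above): the ollama, ollama-runner, and assistants services are profile-gated, so enable their profiles explicitly. OLLAMA_MODEL can also come from ../.env.

# bring up the whole stack with the local ollama backend
OLLAMA_MODEL=phi docker compose --profile api --profile ollama up -d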

louis030195 added the good first issue and help wanted labels on Dec 25, 2023