misc/py-litellm: add port: Call all LLM APIs using the OpenAI format
Call all LLM APIs using the OpenAI format [Bedrock, Huggingface,
VertexAI, TogetherAI, Azure, OpenAI, etc.]

LiteLLM manages:
- Translate inputs to provider's completion, embedding, and
  image_generation endpoints
- Consistent output: text responses will always be available at
  ['choices'][0]['message']['content']
- Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI)
  - Router
- Track spend & set budgets per project (OpenAI Proxy Server)
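
The "consistent output" point above can be sketched as follows. This is a
minimal illustration of the normalized, OpenAI-style response shape; a
hand-built dict stands in for a real provider response, since an actual
litellm.completion() call would need network access and API credentials:

```python
# Sketch of LiteLLM's normalized response shape. Assumption: this dict
# stands in for what a real litellm.completion() call would return.
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello from any provider"}}
    ]
}

# Whatever the backend (Bedrock, Azure, OpenAI, ...), the text is always at:
text = response["choices"][0]["message"]["content"]
print(text)  # → Hello from any provider
```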

WWW: https://github.com/BerriAI/litellm
tagattie committed Feb 12, 2024
1 parent e3dfc2f commit 730828c
Showing 5 changed files with 69 additions and 0 deletions.
1 change: 1 addition & 0 deletions misc/Makefile
@@ -435,6 +435,7 @@
SUBDIR += py-lazrs
SUBDIR += py-lightgbm
SUBDIR += py-lightning-utilities
SUBDIR += py-litellm
SUBDIR += py-log_symbols
SUBDIR += py-mffpy
SUBDIR += py-mmcv
46 changes: 46 additions & 0 deletions misc/py-litellm/Makefile
@@ -0,0 +1,46 @@
PORTNAME= litellm
DISTVERSION= 1.23.9
CATEGORIES= misc python
MASTER_SITES= PYPI
PKGNAMEPREFIX= ${PYTHON_PKGNAMEPREFIX}

MAINTAINER= tagattie@FreeBSD.org
COMMENT= Call all LLM APIs using the OpenAI format
WWW= https://github.com/BerriAI/litellm

LICENSE= MIT
LICENSE_FILE= ${WRKSRC}/LICENSE

BUILD_DEPENDS=	${PYTHON_PKGNAMEPREFIX}poetry-core>0:devel/py-poetry-core@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}wheel>0:devel/py-wheel@${PY_FLAVOR}
RUN_DEPENDS=	${PYTHON_PKGNAMEPREFIX}openai>=1.0.0:misc/py-openai@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}python-dotenv>=0.2.0:www/py-python-dotenv@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}tiktoken>=0.4.0:textproc/py-tiktoken@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}importlib-metadata>=6.8.0:devel/py-importlib-metadata@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}tokenizers>0:textproc/py-tokenizers@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}click>0:devel/py-click@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}Jinja2>=3.1.2<4.0.0:devel/py-Jinja2@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}aiohttp>0:www/py-aiohttp@${PY_FLAVOR} \
		${PYTHON_PKGNAMEPREFIX}requests>=2.31.0<3.0.0:www/py-requests@${PY_FLAVOR}

USES= python shebangfix
USE_PYTHON= autoplist pep517

REINPLACE_ARGS= -i ''
NO_ARCH= yes

PORTDOCS= README.md

OPTIONS_DEFINE= DOCS

post-patch:
	@${REINPLACE_CMD} -e 's|%%PYTHON_CMD%%|${PYTHON_CMD}|' \
		${WRKSRC}/litellm/proxy/start.sh
	@${FIND} ${WRKSRC}/litellm/proxy -type f \
		\( -name '*.orig' -o -name '*.bak' \) -delete

post-install-DOCS-on:
	@${MKDIR} ${STAGEDIR}${DOCSDIR}
	${INSTALL_MAN} ${PORTDOCS:S|^|${WRKSRC}/|} ${STAGEDIR}${DOCSDIR}

.include <bsd.port.mk>
3 changes: 3 additions & 0 deletions misc/py-litellm/distinfo
@@ -0,0 +1,3 @@
TIMESTAMP = 1707722656
SHA256 (litellm-1.23.9.tar.gz) = 0c1e0e56f4d1d9c8a33da09d6736bde9b21a8ea324db8c05cc3de65c6b4fad7d
SIZE (litellm-1.23.9.tar.gz) = 3139242
8 changes: 8 additions & 0 deletions misc/py-litellm/files/patch-litellm_proxy_start.sh
@@ -0,0 +1,8 @@
--- litellm/proxy/start.sh.orig 2024-02-11 03:13:21 UTC
+++ litellm/proxy/start.sh
@@ -1,2 +1,2 @@
-#!/bin/bash
-python3 proxy_cli.py
\ No newline at end of file
+#!/bin/sh
+%%PYTHON_CMD%% proxy_cli.py
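
The Makefile's post-patch target rewrites the %%PYTHON_CMD%% placeholder
that this patch introduces. A standalone sketch of that substitution
(assumptions: a temporary copy of start.sh, /usr/local/bin/python3.9
standing in for ${PYTHON_CMD}, and GNU-style `sed -i.bak` in place of the
port's BSD-style `REINPLACE_ARGS= -i ''`):

```shell
# Recreate start.sh as it looks after the patch above is applied,
# with the %%PYTHON_CMD%% placeholder still in place.
printf '#!/bin/sh\n%%%%PYTHON_CMD%%%% proxy_cli.py\n' > /tmp/start.sh

# What post-patch does via REINPLACE_CMD: substitute the real interpreter
# path. /usr/local/bin/python3.9 is a stand-in for ${PYTHON_CMD}.
sed -i.bak -e 's|%%PYTHON_CMD%%|/usr/local/bin/python3.9|' /tmp/start.sh

cat /tmp/start.sh
```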
11 changes: 11 additions & 0 deletions misc/py-litellm/pkg-descr
@@ -0,0 +1,11 @@
Call all LLM APIs using the OpenAI format [Bedrock, Huggingface,
VertexAI, TogetherAI, Azure, OpenAI, etc.]

LiteLLM manages:
- Translate inputs to provider's completion, embedding, and
  image_generation endpoints
- Consistent output: text responses will always be available at
  ['choices'][0]['message']['content']
- Retry/fallback logic across multiple deployments (e.g. Azure/OpenAI)
  - Router
- Track spend & set budgets per project (OpenAI Proxy Server)
