Merged
8 changes: 8 additions & 0 deletions .github/workflows/publish-docker-offline-amd64.yml
@@ -13,6 +13,14 @@ jobs:
steps:
- uses: actions/checkout@v4

- name: Free up disk space
run: |
sudo rm -rf /usr/share/dotnet
sudo rm -rf /usr/local/lib/android
sudo rm -rf /opt/ghc
sudo rm -rf /opt/hostedtoolcache/CodeQL
docker system prune -af

- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3

8 changes: 8 additions & 0 deletions .github/workflows/publish-docker-offline-arm64.yml
@@ -13,6 +13,14 @@ jobs:
steps:
- uses: actions/checkout@v4

- name: Free up disk space
run: |
sudo rm -rf /usr/share/dotnet
sudo rm -rf /usr/local/lib/android
sudo rm -rf /opt/ghc
sudo rm -rf /opt/hostedtoolcache/CodeQL
docker system prune -af

- name: Set up QEMU
uses: docker/setup-qemu-action@v3

36 changes: 35 additions & 1 deletion README.md
@@ -102,10 +102,18 @@ docker run -p 8000:8000 ghcr.io/codelion/optillm:latest
2024-10-22 07:45:06,293 - INFO - Starting server with approach: auto
```

To use optillm without local inference and only as a proxy you can add the `-proxy` suffix.
**Available Docker image variants:**

- **Full image** (`latest`): Includes all dependencies for local inference and plugins
- **Proxy-only** (`latest-proxy`): Lightweight image without local inference capabilities
- **Offline** (`latest-offline`): Self-contained image with pre-downloaded models (spaCy) for fully offline operation

```bash
# Proxy-only (smallest)
docker pull ghcr.io/codelion/optillm:latest-proxy

# Offline (largest, includes pre-downloaded models)
docker pull ghcr.io/codelion/optillm:latest-offline
```

### Install from source
@@ -120,6 +128,32 @@ source .venv/bin/activate
pip install -r requirements.txt
```

## 🔒 SSL Configuration

OptILLM lets you configure SSL certificate verification, which is useful when working with self-signed certificates or corporate proxies.

**Disable SSL verification (development only):**
```bash
# Command line
optillm --no-ssl-verify

# Environment variable
export OPTILLM_SSL_VERIFY=false
optillm
```

**Use custom CA certificate:**
```bash
# Command line
optillm --ssl-cert-path /path/to/ca-bundle.crt

# Environment variable
export OPTILLM_SSL_CERT_PATH=/path/to/ca-bundle.crt
optillm
```

⚠️ **Security Note**: Disabling SSL verification is insecure and should only be used in development. For production environments with custom CAs, use `--ssl-cert-path` instead. See [SSL_CONFIGURATION.md](SSL_CONFIGURATION.md) for details.
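
The `OPTILLM_SSL_VERIFY` value is parsed as a boolean, with verification enabled unless the variable is set to an explicitly truthy-or-falsy string. A minimal sketch of that parsing rule (the helper name is illustrative, not part of optillm's API):

```python
from typing import Optional

def ssl_verify_enabled(value: Optional[str]) -> bool:
    # Unset means verification stays on; only explicit truthy strings count.
    if value is None:
        return True
    return value.lower() in ("true", "1", "yes")
```

Any value outside `true`/`1`/`yes` (including an empty string) disables verification, so prefer a plain `true` or `false` to keep intent obvious.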

## Implemented techniques

| Approach | Slug | Description |
88 changes: 88 additions & 0 deletions SSL_CONFIGURATION.md
@@ -0,0 +1,88 @@
# SSL Certificate Configuration

OptILLM supports configuring SSL certificate verification, which is useful when working with self-signed certificates or corporate proxies.

## Usage

### Disable SSL Verification (Development Only)

**⚠️ WARNING: Only use this in development environments. Disabling SSL verification is insecure.**

#### Via Command Line
```bash
python optillm.py --no-ssl-verify
```

#### Via Environment Variable
```bash
export OPTILLM_SSL_VERIFY=false
python optillm.py
```

### Use Custom CA Certificate Bundle

For corporate environments with custom Certificate Authorities:

#### Via Command Line
```bash
python optillm.py --ssl-cert-path /path/to/ca-bundle.crt
```

#### Via Environment Variable
```bash
export OPTILLM_SSL_CERT_PATH=/path/to/ca-bundle.crt
python optillm.py
```

## Configuration Options

| Option | Environment Variable | Default | Description |
|--------|---------------------|---------|-------------|
| `--ssl-verify` / `--no-ssl-verify` | `OPTILLM_SSL_VERIFY` | `true` | Enable/disable SSL certificate verification |
| `--ssl-cert-path` | `OPTILLM_SSL_CERT_PATH` | `""` | Path to custom CA certificate bundle |

## Affected Components

SSL configuration applies to:
- **OpenAI API clients** (OpenAI, Azure, Cerebras)
- **HTTP plugins** (readurls, deep_research)
- **All external HTTPS connections**

## Examples

### Development with Self-Signed Certificate
```bash
# Disable SSL verification temporarily
python optillm.py --no-ssl-verify --base-url https://localhost:8443/v1
```

### Production with Corporate CA
```bash
# Use corporate certificate bundle
python optillm.py --ssl-cert-path /etc/ssl/certs/corporate-ca-bundle.crt
```

### Docker Environment
```bash
docker run -e OPTILLM_SSL_VERIFY=false optillm
```

## Security Notes

1. **Never disable SSL verification in production** - This makes your application vulnerable to man-in-the-middle attacks
2. **Use custom CA bundles instead** - For corporate environments, provide the proper CA certificate path
3. **Warning messages** - When SSL verification is disabled, OptILLM will log a warning message for security awareness

## Testing

Run the SSL configuration test suite:
```bash
python -m unittest tests.test_ssl_config -v
```

This validates:
- CLI argument parsing
- Environment variable configuration
- HTTP client SSL settings
- Plugin SSL propagation
- Warning messages
2 changes: 1 addition & 1 deletion optillm/__init__.py
@@ -1,5 +1,5 @@
# Version information
__version__ = "0.3.1"
__version__ = "0.3.2"

# Import from server module
from .server import (
9 changes: 7 additions & 2 deletions optillm/plugins/deep_research_plugin.py
@@ -66,6 +66,9 @@ def create(self, **kwargs):
)
else:
# OpenAI or AzureOpenAI
# Get existing http_client to preserve SSL settings
existing_http_client = getattr(self.parent.client, '_client', None)

if 'Azure' in self.parent.client.__class__.__name__:
from openai import AzureOpenAI
# AzureOpenAI has different parameters
@@ -75,15 +78,17 @@ def create(self, **kwargs):
azure_endpoint=getattr(self.parent.client, 'azure_endpoint', None),
azure_ad_token_provider=getattr(self.parent.client, 'azure_ad_token_provider', None),
timeout=self.parent.timeout,
max_retries=self.parent.max_retries
max_retries=self.parent.max_retries,
http_client=existing_http_client
)
else:
from openai import OpenAI
custom_client = OpenAI(
api_key=self.parent.client.api_key,
base_url=getattr(self.parent.client, 'base_url', None),
timeout=self.parent.timeout,
max_retries=self.parent.max_retries
max_retries=self.parent.max_retries,
http_client=existing_http_client
)
return custom_client.chat.completions.create(**kwargs)
except Exception as e:
24 changes: 19 additions & 5 deletions optillm/plugins/readurls_plugin.py
@@ -1,10 +1,10 @@
import re
from typing import Tuple, List
from typing import Tuple, List, Optional
import requests
import os
from bs4 import BeautifulSoup
from urllib.parse import urlparse
from optillm import __version__
from optillm import __version__, server_config

SLUG = "readurls"

@@ -24,13 +24,27 @@ def extract_urls(text: str) -> List[str]:

return cleaned_urls

def fetch_webpage_content(url: str, max_length: int = 100000) -> str:
def fetch_webpage_content(url: str, max_length: int = 100000, verify_ssl: Optional[bool] = None, cert_path: Optional[str] = None) -> str:
try:
headers = {
'User-Agent': f'optillm/{__version__} (https://github.com/codelion/optillm)'
}

response = requests.get(url, headers=headers, timeout=10)

# Use SSL configuration from server_config if not explicitly provided
if verify_ssl is None:
verify_ssl = server_config.get('ssl_verify', True)
if cert_path is None:
cert_path = server_config.get('ssl_cert_path', '')

# Determine verify parameter for requests
if not verify_ssl:
verify = False
elif cert_path:
verify = cert_path
else:
verify = True

response = requests.get(url, headers=headers, timeout=10, verify=verify)
response.raise_for_status()

# Make a soup
Expand Down
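
The three-way mapping onto the `verify` parameter of `requests` is the core of the readurls change: `requests` accepts `False`, a CA-bundle path, or `True`. A standalone sketch of that mapping (the helper name is illustrative):

```python
def resolve_verify(verify_ssl: bool, cert_path: str):
    """Map optillm's two SSL settings onto the single value requests expects."""
    if not verify_ssl:
        return False      # verification disabled (development only)
    if cert_path:
        return cert_path  # verify against a custom CA bundle
    return True           # default system trust store

# Used as: requests.get(url, timeout=10, verify=resolve_verify(verify_ssl, cert_path))
```

Note that `cert_path` only takes effect while verification is enabled; disabling verification wins over a supplied bundle.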
47 changes: 40 additions & 7 deletions optillm/server.py
@@ -58,7 +58,24 @@
conversation_logger = None

def get_config():
import httpx

API_KEY = None

# Create httpx client with SSL configuration
ssl_verify = server_config.get('ssl_verify', True)
ssl_cert_path = server_config.get('ssl_cert_path', '')

# Determine SSL verification setting
if not ssl_verify:
logger.warning("SSL certificate verification is DISABLED. This is insecure and should only be used for development.")
http_client = httpx.Client(verify=False)
elif ssl_cert_path:
logger.info(f"Using custom CA certificate bundle: {ssl_cert_path}")
http_client = httpx.Client(verify=ssl_cert_path)
else:
http_client = httpx.Client(verify=True)

if os.environ.get("OPTILLM_API_KEY"):
# Use local inference engine
from optillm.inference import create_inference_client
@@ -69,16 +86,16 @@ def get_config():
API_KEY = os.environ.get("CEREBRAS_API_KEY")
base_url = server_config['base_url']
if base_url != "":
default_client = Cerebras(api_key=API_KEY, base_url=base_url)
default_client = Cerebras(api_key=API_KEY, base_url=base_url, http_client=http_client)
else:
default_client = Cerebras(api_key=API_KEY)
default_client = Cerebras(api_key=API_KEY, http_client=http_client)
elif os.environ.get("OPENAI_API_KEY"):
API_KEY = os.environ.get("OPENAI_API_KEY")
base_url = server_config['base_url']
if base_url != "":
default_client = OpenAI(api_key=API_KEY, base_url=base_url)
default_client = OpenAI(api_key=API_KEY, base_url=base_url, http_client=http_client)
else:
default_client = OpenAI(api_key=API_KEY)
default_client = OpenAI(api_key=API_KEY, http_client=http_client)
elif os.environ.get("AZURE_OPENAI_API_KEY"):
API_KEY = os.environ.get("AZURE_OPENAI_API_KEY")
API_VERSION = os.environ.get("AZURE_API_VERSION")
@@ -88,6 +105,7 @@ def get_config():
api_key=API_KEY,
api_version=API_VERSION,
azure_endpoint=AZURE_ENDPOINT,
http_client=http_client
)
else:
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
@@ -96,7 +114,8 @@ def get_config():
default_client = AzureOpenAI(
api_version=API_VERSION,
azure_endpoint=AZURE_ENDPOINT,
azure_ad_token_provider=token_provider
azure_ad_token_provider=token_provider,
http_client=http_client
)
else:
# Import the LiteLLM wrapper
@@ -152,7 +171,7 @@ def count_reasoning_tokens(text: str, tokenizer=None) -> int:

# Server configuration
server_config = {
'approach': 'none',
'approach': 'none',
'mcts_simulations': 2,
'mcts_exploration': 0.2,
'mcts_depth': 1,
@@ -167,6 +186,8 @@ def count_reasoning_tokens(text: str, tokenizer=None) -> int:
'return_full_response': False,
'port': 8000,
'log': 'info',
'ssl_verify': True,
'ssl_cert_path': '',
}

# List of known approaches
@@ -977,7 +998,19 @@ def parse_args():
base_url_default = os.environ.get("OPTILLM_BASE_URL", "")
parser.add_argument("--base-url", "--base_url", dest="base_url", type=str, default=base_url_default,
help="Base url for OpenAI compatible endpoint")


# SSL configuration arguments
ssl_verify_default = os.environ.get("OPTILLM_SSL_VERIFY", "true").lower() in ("true", "1", "yes")
parser.add_argument("--ssl-verify", dest="ssl_verify", action="store_true" if ssl_verify_default else "store_false",
default=ssl_verify_default,
help="Enable SSL certificate verification (default: True)")
parser.add_argument("--no-ssl-verify", dest="ssl_verify", action="store_false",
help="Disable SSL certificate verification")

ssl_cert_path_default = os.environ.get("OPTILLM_SSL_CERT_PATH", "")
parser.add_argument("--ssl-cert-path", dest="ssl_cert_path", type=str, default=ssl_cert_path_default,
help="Path to custom CA certificate bundle for SSL verification")

# Use the function to get the default path
default_config_path = get_config_path()

Expand Down
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

[project]
name = "optillm"
version = "0.3.1"
version = "0.3.2"
description = "An optimizing inference proxy for LLMs."
readme = "README.md"
license = "Apache-2.0"