This repository was archived by the owner on Jul 16, 2025. It is now read-only.
Merged
86 changes: 44 additions & 42 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -2,8 +2,8 @@

PHP library for building LLM-based and AI-based features and applications.

This library is not a stable yet, but still rather experimental. Feel free to try it out, give feedback, ask questions,
contribute or share your use cases. Abstractions, concepts and interfaces are not final and potentially subject of change.
This library is not stable yet, but still rather experimental. Feel free to try it out, give feedback, ask questions,
contribute, or share your use cases. Abstractions, concepts, and interfaces are not final and potentially subject to change.

## Requirements

Expand All @@ -21,7 +21,7 @@ When using Symfony Framework, check out the integration bundle [php-llm/llm-chai

## Examples

See [examples](examples) folder to run example implementations using this library.
See [the examples folder](examples) to run example implementations using this library.
Depending on the example, you need to export different environment variables
for API keys or deployment configurations, or create a `.env.local` based on the `.env` file.
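For instance, a minimal `.env.local` might look like this (the variable names below match the checks in the example scripts; include only the ones your chosen examples need):

```
# Only the providers you actually run examples against are required
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...
OLLAMA_HOST_URL=http://localhost:11434
```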

Expand All @@ -34,10 +34,10 @@ For a more sophisticated demo, see the [Symfony Demo Application](https://github
### Models & Platforms

LLM Chain categorizes two main types of models: **Language Models** and **Embeddings Models**. On top of that, there are
other models, like text-to-speech, image generation or classification models that are also supported.
other models, like text-to-speech, image generation, or classification models that are also supported.

Language Models, like GPT, Claude and Llama, as essential centerpiece of LLM applications
and Embeddings Models as supporting models to provide vector representations of text.
Language Models, like GPT, Claude, and Llama, serve as the essential centerpiece of LLM applications,
while Embeddings Models act as supporting models that provide vector representations of text.

Those models are provided by different **platforms**, like OpenAI, Azure, Google, Replicate, and others.
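As a rough sketch of how model and platform fit together — based on the OpenAI bridge used in the example scripts; the `PlatformFactory` namespace and the concrete model class (here the speech-to-text model `Whisper`) are assumptions that vary per bridge:

```php
<?php

// Namespaces assumed from the example scripts — check the bridge you actually use
use PhpLlm\LlmChain\Model\Message\Content\Audio;

// A platform wraps the HTTP access to a provider…
$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);

// …and a model instance tells it what capability to invoke
$model = new Whisper();

// Low-level usage without a chain: send content straight to the platform
$response = $platform->request($model, Audio::fromFile('/path/to/audio.mp3'));
```

Chat-style Language Models and Embeddings Models follow the same platform/model split; the chain shown below builds on top of it.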

Expand Down Expand Up @@ -112,8 +112,8 @@ The second parameter of the `call` method is an array of options, which can be u
chain, like `stream`, `output_structure`, or `response_format`. This behavior is a combination of features provided by
the underlying model and platform, or additional features provided by processors registered to the chain.

Options design for additional features provided by LLM Chain can be found in this documentation. For model and platform
specific options, please refer to the respective documentation.
Options designed for additional features provided by LLM Chain can be found in this documentation. For model- and
platform-specific options, please refer to the respective documentation.

```php
// Chain and MessageBag instantiation
Expand All @@ -126,13 +126,15 @@ $response = $chain->call($messages, [

#### Code Examples

1. **Anthropic's Claude**: [chat-claude-anthropic.php](examples/chat-claude-anthropic.php)
1. **OpenAI's GPT with Azure**: [chat-gpt-azure.php](examples/chat-gpt-azure.php)
1. **OpenAI's GPT**: [chat-gpt-openai.php](examples/chat-gpt-openai.php)
1. **OpenAI's o1**: [chat-o1-openai.php](examples/chat-o1-openai.php)
1. **Meta's Llama with Ollama**: [chat-llama-ollama.php](examples/chat-llama-ollama.php)
1. **Meta's Llama with Replicate**: [chat-llama-replicate.php](examples/chat-llama-replicate.php)
1. **Google's Gemini with OpenRouter**: [chat-gemini-openrouter.php](examples/chat-gemini-openrouter.php)
1. [Anthropic's Claude](examples/anthropic/chat.php)
1. [OpenAI's GPT with Azure](examples/azure/chat-gpt.php)
1. [OpenAI's GPT](examples/openai/chat.php)
1. [OpenAI's o1](examples/openai/chat-o1.php)
1. [Meta's Llama with Azure](examples/azure/chat-llama.php)
1. [Meta's Llama with Ollama](examples/ollama/chat-llama.php)
1. [Meta's Llama with Replicate](examples/replicate/chat-llama.php)
1. [Google's Gemini with Google](examples/google/chat.php)
1. [Google's Gemini with OpenRouter](examples/openrouter/chat-gemini.php)

### Tools

Expand Down Expand Up @@ -340,14 +342,14 @@ $eventDispatcher->addListener(ToolCallsExecuted::class, function (ToolCallsExecu

#### Code Examples (with built-in tools)

1. **Brave Tool**: [toolbox-brave.php](examples/toolbox-brave.php)
1. **Clock Tool**: [toolbox-clock.php](examples/toolbox-clock.php)
1. **Crawler Tool**: [toolbox-brave.php](examples/toolbox-brave.php)
1. **SerpAPI Tool**: [toolbox-serpapi.php](examples/toolbox-serpapi.php)
1. **Tavily Tool**: [toolbox-tavily.php](examples/toolbox-tavily.php)
1. **Weather Tool with Event Listener**: [toolbox-weather-event.php](examples/toolbox-weather-event.php)
1. **Wikipedia Tool**: [toolbox-wikipedia.php](examples/toolbox-wikipedia.php)
1. **YouTube Transcriber Tool**: [toolbox-youtube.php](examples/toolbox-youtube.php) (with streaming)
1. [Brave Tool](examples/toolbox/brave.php)
1. [Clock Tool](examples/toolbox/clock.php)
1. [Crawler Tool](examples/toolbox/brave.php)
1. [SerpAPI Tool](examples/toolbox/serpapi.php)
1. [Tavily Tool](examples/toolbox/tavily.php)
1. [Weather Tool with Event Listener](examples/toolbox/weather-event.php)
1. [Wikipedia Tool](examples/anthropic/toolbox.php)
1. [YouTube Transcriber Tool](examples/openai/toolbox.php) (with streaming)

### Document Embedding, Vector Stores & Similarity Search (RAG)

Expand Down Expand Up @@ -383,7 +385,7 @@ foreach ($entities as $entity) {
$documents[] = new TextDocument(
id: $entity->getId(), // UUID instance
content: $entity->toString(), // Text representation of relevant data for embedding
metadata: new Metadata($entity->toArray()), // Array representation of entity to be stored additionally
metadata: new Metadata($entity->toArray()), // Array representation of an entity to be stored additionally
);
}
```
Expand Down Expand Up @@ -421,8 +423,8 @@ $response = $chain->call($messages);

#### Code Examples

1. **MongoDB Store**: [store-mongodb-similarity-search.php](examples/store-mongodb-similarity-search.php)
1. **Pinecone Store**: [store-pinecone-similarity-search.php](examples/store-pinecone-similarity-search.php)
1. [MongoDB Store](examples/store-mongodb-similarity-search.php)
1. [Pinecone Store](examples/store-pinecone-similarity-search.php)

#### Supported Stores

Expand All @@ -442,7 +444,7 @@ by features like **Structured Output** or providing a **Response Format**.

#### PHP Classes as Output

LLM Chain support that use-case by abstracting the hustle of defining and providing schemas to the LLM and converting
LLM Chain supports that use case by abstracting the hassle of defining and providing schemas to the LLM and converting
the response back to PHP objects.

To achieve this, a specific chain processor needs to be registered:
Expand Down Expand Up @@ -505,8 +507,8 @@ dump($response->getContent()); // returns an array

#### Code Examples

1. **Structured Output** (PHP class): [structured-output-math.php](examples/structured-output-math.php)
1. **Structured Output** (array): [structured-output-clock.php](examples/structured-output-clock.php)
1. [Structured Output with PHP class](examples/openai/structured-output-math.php)
1. [Structured Output with array](examples/openai/structured-output-clock.php)

### Response Streaming

Expand Down Expand Up @@ -539,8 +541,8 @@ needs to be used.

#### Code Examples

1. **Streaming Claude**: [stream-claude-anthropic.php](examples/stream-claude-anthropic.php)
1. **Streaming GPT**: [stream-gpt-openai.php](examples/stream-gpt-openai.php)
1. [Streaming Claude](examples/anthropic/stream.php)
1. [Streaming GPT](examples/openai/stream.php)

### Image Processing

Expand All @@ -567,8 +569,8 @@ $response = $chain->call($messages);

#### Code Examples

1. **Image Description**: [image-describer-binary.php](examples/image-describer-binary.php) (with binary file)
1. **Image Description**: [image-describer-url.php](examples/image-describer-url.php) (with URL)
1. [Binary Image Input with GPT](examples/openai/image-input-binary.php)
1. [Image URL Input with GPT](examples/openai/image-input-url.php)

### Audio Processing

Expand All @@ -592,11 +594,11 @@ $response = $chain->call($messages);

#### Code Examples

1. **Audio Description**: [audio-describer.php](examples/audio-describer.php)
1. [Audio Input with GPT](examples/openai/audio-input.php)

### Embeddings

Creating embeddings of word, sentences or paragraphs is a typical use case around the interaction with LLMs and
Creating embeddings of words, sentences, or paragraphs is a typical use case around the interaction with LLMs, and
therefore LLM Chain implements an `EmbeddingsModel` interface with various models, see above.

The standalone usage results in a `Vector` instance:
Expand All @@ -615,8 +617,8 @@ dump($vectors[0]->getData()); // Array of float values

#### Code Examples

1. **OpenAI's Emebddings**: [embeddings-openai.php](examples/embeddings-openai.php)
1. **Voyage's Embeddings**: [embeddings-voyage.php](examples/embeddings-voyage.php)
1. [OpenAI's Embeddings](examples/openai/embeddings.php)
1. [Voyage's Embeddings](examples/voyage/embeddings.php)

### Parallel Platform Calls

Expand All @@ -639,11 +641,11 @@ foreach ($responses as $response) {

#### Code Examples

1. **Parallel GPT Calls**: [parallel-chat-gpt.php](examples/parallel-chat-gpt.php)
1. **Parallel Embeddings Calls**: [parallel-embeddings.php](examples/parallel-embeddings.php)
1. [Parallel GPT Calls](examples/parallel-chat-gpt.php)
1. [Parallel Embeddings Calls](examples/parallel-embeddings.php)

> [!NOTE]
> Please be aware that some embeddings models also support batch processing out of the box.
> Please be aware that some embedding models also support batch processing out of the box.

### Input & Output Processing

Expand Down Expand Up @@ -733,7 +735,7 @@ final class MyProcessor implements OutputProcessor, ChainAwareProcessor
## HuggingFace

LLM Chain comes out of the box with an integration for [HuggingFace](https://huggingface.co/) which is a platform for
hosting and sharing all kind of models, including LLMs, embeddings, image generation and classification models.
hosting and sharing all kinds of models, including LLMs, embeddings, image generation, and classification models.

You can just instantiate the Platform with the corresponding HuggingFace bridge and use it with the `task` option:
```php
Expand Down Expand Up @@ -800,7 +802,7 @@ echo $response->getContent().PHP_EOL;

#### Code Examples

1. [Text Generation with TransformersPHP](examples/transformers-text-generation.php)
1. [Text Generation with TransformersPHP](examples/transformers/text-generation.php)

## Contributions

Expand Down
4 changes: 2 additions & 2 deletions example
Original file line number Diff line number Diff line change
Expand Up @@ -56,7 +56,7 @@ $app = (new SingleCommandApplication('LLM Chain Example Runner'))
: (1 === $run['process']->getExitCode() || $emptyOutput ? '<error>Failed</error>' : '<comment>Skipped</comment>');
}

$table->addRow([$example->getFilename(), $state, $output]);
$table->addRow([$example->getRelativePathname(), $state, $output]);
}
$table->render();
};
Expand Down Expand Up @@ -87,7 +87,7 @@ $app = (new SingleCommandApplication('LLM Chain Example Runner'))

foreach ($exampleRuns as $run) {
if (!$run['process']->isSuccessful()) {
$io->section('Error in ' . $run['example']->getFilename());
$io->section('Error in ' . $run['example']->getRelativePathname());
$io->text($run['process']->getOutput());
$io->text($run['process']->getErrorOutput());
}
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['ANTHROPIC_API_KEY'])) {
echo 'Please set the ANTHROPIC_API_KEY environment variable.'.PHP_EOL;
Expand Down
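Most of the churn in the example files above and below is the same mechanical change: the scripts moved one directory deeper (e.g. `examples/chat-claude-anthropic.php` → `examples/anthropic/chat.php`), so relative paths to `vendor/` and `.env` now need to climb two levels instead of one. PHP's `dirname()` accepts the number of levels as its second argument, which keeps the bootstrap to a single call:

```php
<?php
// dirname() with a levels argument walks up the directory tree
var_dump(dirname('/repo/examples/openai/chat.php'));     // "/repo/examples/openai"
var_dump(dirname('/repo/examples/openai/chat.php', 2));  // "/repo/examples"
var_dump(dirname('/repo/examples/openai/chat.php', 3));  // "/repo"
```

Hence `dirname(__DIR__).'/vendor/autoload.php'` becomes `dirname(__DIR__, 2).'/vendor/autoload.php'` throughout.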
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['ANTHROPIC_API_KEY'])) {
echo 'Please set the ANTHROPIC_API_KEY environment variable.'.PHP_EOL;
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -11,8 +11,8 @@
use Symfony\Component\Dotenv\Dotenv;
use Symfony\Component\HttpClient\HttpClient;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['ANTHROPIC_API_KEY'])) {
echo 'Please set the ANTHROPIC_API_KEY environment variable.'.PHP_EOL;
Expand Down
4 changes: 2 additions & 2 deletions examples/chat-gpt-azure.php → examples/azure/chat-gpt.php
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['AZURE_OPENAI_BASEURL']) || empty($_ENV['AZURE_OPENAI_DEPLOYMENT']) || empty($_ENV['AZURE_OPENAI_VERSION']) || empty($_ENV['AZURE_OPENAI_KEY'])
) {
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['AZURE_LLAMA_BASEURL']) || empty($_ENV['AZURE_LLAMA_KEY'])) {
echo 'Please set the AZURE_LLAMA_BASEURL and AZURE_LLAMA_KEY environment variable.'.PHP_EOL;
Expand Down
4 changes: 2 additions & 2 deletions examples/chat-gemini-google.php → examples/google/chat.php
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['GOOGLE_API_KEY'])) {
echo 'Please set the GOOGLE_API_KEY environment variable.'.PHP_EOL;
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -8,8 +8,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['GOOGLE_API_KEY'])) {
echo 'Please set the GOOGLE_API_KEY environment variable.'.PHP_EOL;
Expand All @@ -24,7 +24,7 @@
Message::forSystem('You are an image analyzer bot that helps identify the content of images.'),
Message::ofUser(
'Describe the image as a comedian would do it.',
Image::fromFile(dirname(__DIR__).'/tests/Fixture/image.jpg'),
Image::fromFile(dirname(__DIR__, 2).'/tests/Fixture/image.jpg'),
),
);
$response = $chain->call($messages);
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['GOOGLE_API_KEY'])) {
echo 'Please set the GOOGLE_API_KEY environment variable.'.PHP_EOL;
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['OLLAMA_HOST_URL'])) {
echo 'Please set the OLLAMA_HOST_URL environment variable.'.PHP_EOL;
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -8,8 +8,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['OPENAI_API_KEY'])) {
echo 'Please set the OPENAI_API_KEY environment variable.'.PHP_EOL;
Expand All @@ -23,7 +23,7 @@
$messages = new MessageBag(
Message::ofUser(
'What is this recording about?',
Audio::fromFile(dirname(__DIR__).'/tests/Fixture/audio.mp3'),
Audio::fromFile(dirname(__DIR__, 2).'/tests/Fixture/audio.mp3'),
),
);
$response = $chain->call($messages);
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -5,8 +5,8 @@
use PhpLlm\LlmChain\Model\Message\Content\Audio;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['OPENAI_API_KEY'])) {
echo 'Please set the OPENAI_API_KEY environment variable.'.PHP_EOL;
Expand All @@ -15,7 +15,7 @@

$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
$model = new Whisper();
$file = Audio::fromFile(dirname(__DIR__).'/tests/Fixture/audio.mp3');
$file = Audio::fromFile(dirname(__DIR__, 2).'/tests/Fixture/audio.mp3');

$response = $platform->request($model, $file);

Expand Down
4 changes: 2 additions & 2 deletions examples/chat-o1-openai.php → examples/openai/chat-o1.php
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['OPENAI_API_KEY'])) {
echo 'Please set the OPENAI_API_KEY environment variable.'.PHP_EOL;
Expand Down
4 changes: 2 additions & 2 deletions examples/chat-gpt-openai.php → examples/openai/chat.php
Original file line number Diff line number Diff line change
Expand Up @@ -7,8 +7,8 @@
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');
require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (empty($_ENV['OPENAI_API_KEY'])) {
echo 'Please set the OPENAI_API_KEY environment variable.'.PHP_EOL;
Expand Down