NAME

AI::Ollama::Client - Client for AI::Ollama

SYNOPSIS

use 5.020;
use AI::Ollama::Client;

my $client = AI::Ollama::Client->new(
    server => 'https://example.com/',
);
my $res = $client->someMethod()->get;
say $res;

METHODS

checkBlob

my $res = $client->checkBlob()->get;

Check whether a blob exists on the Ollama server. This is useful when creating models.
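
The stub above omits the required argument; a minimal sketch of a call, assuming the blob is addressed by a digest parameter as in Ollama's /api/blobs/:digest route (the parameter name and digest are assumptions, not verified against this client):

my $res = $client->checkBlob(
    digest => 'sha256:0123456789abcdef',   # assumed parameter name; placeholder digest
)->get;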

createBlob

my $res = $client->createBlob()->get;

Create a blob from a file. Returns the server file path.
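
A minimal sketch, assuming the call takes the expected digest plus the raw file contents (both parameter names are assumptions based on the Ollama REST API, not verified against this client):

use File::Slurper 'read_binary';

my $path = $client->createBlob(
    digest  => 'sha256:0123456789abcdef',   # assumed: digest of the file contents
    content => read_binary('model.gguf'),   # assumed: raw bytes; hypothetical file name
)->get;
say $path;   # the server file path, per the description above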

generateChatCompletion

my $res = $client->generateChatCompletion()->get;

Generate the next message in a chat with a provided model.

Returns an AI::Ollama::GenerateChatCompletionResponse.
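
A minimal sketch of a call with arguments, assuming the parameter names mirror Ollama's /api/chat request body and guessing at the response accessors (all assumptions, not verified against this client):

my $res = $client->generateChatCompletion(
    model    => 'llama2',   # assumed parameter names, per /api/chat
    messages => [
        { role => 'user', content => 'Why is the sky blue?' },
    ],
)->get;
say $res->message->content;   # 'message'/'content' accessors are assumptions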

copyModel

my $res = $client->copyModel()->get;

Create a copy of an existing model under a new name.
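
A minimal sketch, assuming the parameters mirror Ollama's /api/copy request body (source and destination are assumptions, not verified against this client):

$client->copyModel(
    source      => 'llama2',          # assumed parameter names, per /api/copy
    destination => 'llama2-backup',
)->get;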

createModel

my $res = $client->createModel()->get;

Create a model from a Modelfile.

Returns an AI::Ollama::CreateModelResponse.
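
A minimal sketch, assuming the parameters mirror Ollama's /api/create request body (name and modelfile are assumptions, not verified against this client):

my $res = $client->createModel(
    name      => 'mario',   # assumed parameter names, per /api/create
    modelfile => "FROM llama2\nSYSTEM You are Mario from Super Mario Bros.",
)->get;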

deleteModel

my $res = $client->deleteModel()->get;

Delete a model and its data.
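
A minimal sketch, assuming the model is selected via a name parameter as in Ollama's /api/delete request body (an assumption, not verified against this client):

$client->deleteModel(
    name => 'llama2:13b',   # assumed parameter name, per /api/delete
)->get;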

generateEmbedding

my $res = $client->generateEmbedding()->get;

Generate embeddings from a model.

Returns an AI::Ollama::GenerateEmbeddingResponse.
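
A minimal sketch, assuming the parameters mirror Ollama's /api/embeddings request body and guessing at the response accessor (all assumptions, not verified against this client):

my $res = $client->generateEmbedding(
    model  => 'llama2',                              # assumed parameter names,
    prompt => 'Here is an article about llamas...',  # per /api/embeddings
)->get;
say scalar $res->embedding->@*;   # 'embedding' accessor is an assumption; prints the vector length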

generateCompletion

use Future::Utils 'repeat';
use Future::Mojo;
use experimental 'signatures';

my $responses = $client->generateCompletion();
repeat {
    # Fetch the next partial response from the stream, if one is available
    my ($res) = $responses->shift;
    if( $res ) {
        my $str = $res->get;
        say $str;
    }

    # Complete this iteration with a flag: true once the stream is exhausted
    Future::Mojo->done( !defined $res );
} until => sub($done) { $done->get };

Generate a response for a given prompt with a provided model.

Returns an AI::Ollama::GenerateCompletionResponse.
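
The streaming loop above omits the request itself; a minimal sketch of the call, assuming the parameter names mirror Ollama's /api/generate request body (an assumption, not verified against this client):

my $responses = $client->generateCompletion(
    model  => 'llama2',                 # assumed parameter names,
    prompt => 'Why is the sky blue?',   # per /api/generate
);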

pullModel

my $res = $client->pullModel(
    name => 'llama',
)->get;

Download a model from the Ollama library.

Returns an AI::Ollama::PullModelResponse.

pushModel

my $res = $client->pushModel()->get;

Upload a model to a model library.

Returns an AI::Ollama::PushModelResponse.
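
A minimal sketch, assuming the parameters mirror Ollama's /api/push request body, where the name must include a namespace (an assumption, not verified against this client):

my $res = $client->pushModel(
    name => 'mynamespace/mymodel:latest',   # assumed parameter name, per /api/push
)->get;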

showModelInfo

my $info = $client->showModelInfo()->get;
say $info->modelfile;

Show details about a model, including its modelfile, template, parameters, license, and system prompt.

Returns an AI::Ollama::ModelInfo.
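
The example above omits the model to inspect; a minimal sketch, assuming the model is selected via a name parameter as in Ollama's /api/show request body (an assumption, not verified against this client):

my $info = $client->showModelInfo(
    name => 'llama2',   # assumed parameter name, per /api/show
)->get;
say $info->modelfile;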

listModels

my $info = $client->listModels()->get;
for my $model ($info->models->@*) {
    say $model->model; # llama2:latest
}

List models that are available locally.

Returns an AI::Ollama::ModelsResponse.
