Does it support the Ollama service? #23
Comments
Hi! Thank you for your question and your patience! Unfortunately, the plugin NppOpenAI doesn't support Ollama directly, but you can use a PHP proxy to translate NppOpenAI requests into Ollama requests and Ollama responses back. For example, you can create a PHP script like this:

```php
<?php
header('Content-Type: application/json; charset=utf-8');

// Set up some variables
$ollama_url = 'http://localhost:11434/api/generate'; # Native Ollama endpoint -- this script sends 'prompt'/'system' and reads 'response', which matches /api/generate rather than the OpenAI-compatible /v1/chat/completions
$postfields = [ # 'prompt' and (optional) 'system' will be added later
    'model' => 'latest', # PLEASE SET UP MODEL HERE! (additional models: 'llama2', 'orca-mini:3b-q4_1', 'llama2:13b' etc.)
    'stream' => false,
    'options' => [
        'temperature' => 0.7, # Default -- will be overwritten by NppOpenAI.ini
        'top_k' => 40, # Default value
        'top_p' => 0.9, # Default -- will be overwritten by NppOpenAI.ini
        // For additional options see: https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values
    ],
];

// Let's get started
try {

    // Check PHP input
    $input = json_decode(file_get_contents('php://input'), true);
    if (!$input) {
        throw new \Exception("Non-JSON input received");
    }

    // Check system/prompt related elements
    $is_system_message = (($input['messages'][0]['role'] ?? '') == 'system');
    if (!isset($input['messages'][0]['content'])) {
        throw new \Exception("No message received");
    }
    if ($is_system_message && !isset($input['messages'][1]['content'])) {
        throw new \Exception("No message received, only instructions (system message)");
    }

    // Add system message
    if ($is_system_message) {
        $postfields['system'] = $input['messages'][0]['content'];
    }

    // Add prompt
    $postfields['prompt'] = !$is_system_message
        ? $input['messages'][0]['content']
        : $input['messages'][1]['content'];

    // Update some options, if possible
    // $postfields['model'] = $input['model'] ?? $postfields['model']; # Use the model above to support system messages
    $postfields['options']['temperature'] = $input['temperature'] ?? $postfields['options']['temperature'];
    $postfields['options']['top_p'] = $input['top_p'] ?? $postfields['options']['top_p'];

    // Call Ollama
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $ollama_url);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0); # OK on localhost
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0); # OK on localhost
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); # It may come in handy
    curl_setopt($ch, CURLOPT_MAXREDIRS, 10); # It may come in handy
    curl_setopt($ch, CURLOPT_TIMEOUT, 60); # Increase if necessary
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); # Required for output
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($postfields));
    $curl_data = curl_exec($ch);
    $curl_errno = curl_errno($ch);
    if ($curl_errno) {
        $curl_err = curl_error($ch) ?: $curl_errno;
        curl_close($ch);
        throw new \Exception("CURL error: {$curl_err}");
    }
    curl_close($ch);

    // Handle Ollama's response
    $response = json_decode($curl_data, true);
    if (!$response) {
        throw new \Exception("Non-JSON response received: {$curl_data}");
    }
    if (!isset($response['response'])) {
        throw new \Exception("Missing response; Ollama's answer: " . print_r($response, true));
    }

    // Convert/print output
    $output = [
        'usage' => [
            'total_tokens' => (int)($response['prompt_eval_count'] ?? 0) + (int)($response['eval_count'] ?? 0), # Total token usage
        ],
        'choices' => [ [
            'message' => [
                'content' => $response['response'],
            ],
            'finish_reason' => 'stop', # In the OpenAI response schema, finish_reason belongs to the choice, not to 'message'
        ] ],
    ];
    echo json_encode($output);

// Handle errors
} catch (Exception $e) {
    echo json_encode([
        'error' => [
            'message' => $e->getMessage(),
        ],
    ]);
}
?>
```

You may install WAMPServer, save the script above as a PHP file, and then update the API URL in NppOpenAI.ini so that it points to this script.
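For reference, the request-side mapping the proxy performs (flattening an OpenAI-style `messages` array into Ollama's `system` and `prompt` fields) can be checked in isolation. This is just a sketch; the helper name `map_messages` is hypothetical, not part of the plugin or of Ollama:

```php
<?php
// Hypothetical helper: the same message-to-prompt mapping the proxy script uses.
// An OpenAI-style 'messages' array is flattened into Ollama's 'system'/'prompt' fields.
function map_messages(array $input): array
{
    $fields = [];
    $is_system_message = (($input['messages'][0]['role'] ?? '') == 'system');
    if ($is_system_message) {
        $fields['system'] = $input['messages'][0]['content'];
    }
    $fields['prompt'] = !$is_system_message
        ? $input['messages'][0]['content']
        : $input['messages'][1]['content'];
    return $fields;
}

// Example: a request with instructions (system message) and a user prompt
$fields = map_messages([
    'messages' => [
        ['role' => 'system', 'content' => 'Answer briefly.'],
        ['role' => 'user', 'content' => 'What is Ollama?'],
    ],
]);
echo json_encode($fields); # {"system":"Answer briefly.","prompt":"What is Ollama?"}
```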
The script above should work, but it has been tested in a simulated environment only, without Ollama. If you have any questions, feel free to write a new comment.
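Similarly, the response-side conversion (turning a decoded Ollama generate reply into the OpenAI-style completion object the plugin reads) boils down to the following sketch; the helper name `convert_response` is hypothetical:

```php
<?php
// Hypothetical helper: converts a decoded Ollama generate reply into the
// OpenAI-style shape (usage + choices) that the proxy script returns.
function convert_response(array $response): array
{
    return [
        'usage' => [
            // prompt_eval_count + eval_count give the total token usage
            'total_tokens' => (int)($response['prompt_eval_count'] ?? 0) + (int)($response['eval_count'] ?? 0),
        ],
        'choices' => [ [
            'message' => [ 'content' => $response['response'] ],
            'finish_reason' => 'stop',
        ] ],
    ];
}

$output = convert_response([
    'response' => 'Hello!',
    'prompt_eval_count' => 5,
    'eval_count' => 3,
]);
echo $output['usage']['total_tokens'];             # 8
echo $output['choices'][0]['message']['content'];  # Hello!
```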
Thank you very much for your reply!
Does this plugin support Ollama?
I tried changing the URL to http://localhost:11434/v1/chat/completions, but it didn't work.
What should I do? Thank you very much.