LocalAI and OpenVINO

LocalAI is a free, open-source alternative to OpenAI. It acts as a drop-in replacement REST API compatible with the OpenAI API specification for local inferencing. It lets you run LLMs and generate images, audio, and more, locally or on-prem on consumer-grade hardware, and supports multiple model families and architectures. It does not require a GPU. LocalAI is created and maintained by Ettore Di Giacinto.

In this tutorial, we show how to prepare a model config and launch an OpenVINO LLM model with LocalAI in a Docker container.
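A LocalAI model config is a YAML file placed in the models directory. A minimal sketch is shown below; the model name `phi3-ov` and the model repository shown are placeholders, and the exact field names should be verified against the LocalAI backend documentation for your version:

```yaml
# models/phi3-ov.yaml -- hypothetical example config
name: phi3-ov           # model name that clients will request
backend: transformers   # LocalAI backend able to load OpenVINO models
type: OVModelForCausalLM  # use the OpenVINO causal-LM model class
context_size: 4096
parameters:
  model: some-org/some-openvino-model  # placeholder model repo id
```

LocalAI reads such files at startup and exposes each `name` as a selectable model through its OpenAI-compatible API.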

Notebook contents

  • Prepare Docker
  • Prepare a model
  • Run the server
  • Send a client request

Installation instructions

This is a self-contained example that relies solely on its own code.
We recommend running the notebook in a virtual environment. You only need a Jupyter server to start. For details, please refer to the Installation Guide.