
🔮 Instill Core


A complete unstructured data solution: ETL processing, AI-readiness, open-source LLM hosting, and RAG capabilities in one powerful platform.

Quick start

Follow the installation steps below, or see the documentation for more details, to build versatile AI applications locally.

What is Instill Core?

Instill Core is an end-to-end AI platform for data, pipeline and model orchestration.

🔮 Instill Core - The full-stack AI infrastructure tool

🔮 Instill Core removes the infrastructure hassle and provides these core features:

  • 💧 Pipeline: Quickly build versatile AI-first APIs or automated workflows.
  • ⚗️ Model: Deploy and monitor AI models without GPU infrastructure hassles.
  • 💾 Artifact: Transform unstructured data (e.g., documents, images, audio, video) into AI-ready formats.
  • ⚙️ Component: Connect essential building blocks to construct powerful pipelines.

What can you build?

  • 📖 Parsing PDF Files to Markdown: Cookbook
  • 🧱 Generating Structured Outputs from LLMs: Cookbook & Tutorial
  • 🕸️ Web scraping & Google Search with Structured Insights
  • 🌱 Instance segmentation on microscopic plant stomata images: Cookbook

See Examples for more!

Installation

Prerequisites

Requirements and instructions by operating system:

macOS or Linux
  • Instill Core works natively

Windows
  • Use Windows Subsystem for Linux (WSL2)
  • Install latest yq from GitHub Repository
  • Install latest Docker Desktop and enable WSL2 integration (tutorial)
  • (Optional) Install cuda-toolkit on WSL2 (NVIDIA tutorial)

All Systems
  • Docker Engine v25 or later
  • Docker Compose v2 or later
  • Install latest stable Docker and Docker Compose
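
To confirm your installation meets these requirements, you can run the standard Docker version checks (nothing Instill-specific is assumed here):

$ docker --version          # expect Docker Engine v25 or later
$ docker compose version    # expect Docker Compose v2 or later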

Steps

Use stable release version

Execute the following commands to pull the pre-built images with all the dependencies and launch the services:

$ git clone -b v0.51.0 https://github.com/instill-ai/instill-core.git && cd instill-core

# Launch all services
$ make all
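
While the services start up, a generic Docker status check lets you watch the containers come up (standard Docker CLI, nothing Instill-specific):

$ docker ps    # the Instill Core containers should eventually report a healthy status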

Use the latest version for local development

Execute the following commands to build the images with all the dependencies and launch the services:

$ git clone https://github.com/instill-ai/instill-core.git && cd instill-core

# Launch all services
$ make latest
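
If a service fails to come up during local development, following its container logs is usually the quickest way to diagnose the problem (standard Docker CLI; the container name below is a placeholder, so list the real names first):

$ docker ps --format '{{.Names}}'    # list the running container names
$ docker logs -f <container-name>    # follow the logs of the container you want to inspect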

Important

Code in the main branch tracks under-development progress towards the next release and may not work as expected. If you are looking for a stable alpha version, please use the latest release.

🚀 That's it! Once all the services are up and report a healthy status, the UI is ready to go at http://localhost:3000. Please find the default login credentials in the documentation.
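
For a quick check that the Console is reachable, a plain HTTP request against the default port mentioned above is enough (no Instill-specific endpoint is assumed):

$ curl -I http://localhost:3000    # expect an HTTP 200 response once the Console is ready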

To shut down all running services:

make down

Deployment

Visit the Deployment Overview for more details.

Client Access

Beyond the Console UI, Instill Core can also be accessed programmatically; see the documentation for the available client options.

Documentation

Please visit our official documentation for more details.

Additional resources:

Contributing

We welcome contributions from our community! Check out the methods below:

  1. Cookbooks: Help us create helpful pipelines and guides for the community. Visit our Cookbook repository to get started.

  2. Issues: Contribute to improvements by raising tickets using the templates here, or join the discussion on existing issues you think you can help with.

Community Standards

We are committed to maintaining a respectful and welcoming atmosphere for all contributors. Before contributing, please read our Code of Conduct and contributing guidelines.

Support

Get help by joining our Discord community, where you can post any questions in our #ask-for-help channel.

License

See the LICENSE file for licensing information.