Liquid AI Cookbook

Build with LFM2 Models and the LEAP SDK

🌊 Documentation   |   🤗 Hugging Face   |   🚀 LEAP Edge SDK   |   📚 Tutorials   |   🏗️ Community Examples

Join Discord  


Welcome, dear developer!

This repository contains examples, tutorials, and applications for building with Liquid AI open-weight models and the open-source LEAP SDK.

Whether you're looking to fine-tune models, deploy to edge devices, or build complete applications, you'll find resources here to get started.

What are you looking for?

Fine-Tune an LFM2 model

LFM2 (Text-to-text)

LFM2 is a generation of hybrid models designed for on-device deployment, ranging from 350M to 8B parameters.

These models are particularly suited for agentic tasks, data extraction, RAG, creative writing, and multi-turn conversations. We do not recommend using them for tasks that are knowledge-intensive or require programming skills.
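To get a feel for the models before fine-tuning, the smaller checkpoints can be run locally with Hugging Face transformers. The snippet below is a minimal sketch, assuming a recent transformers release with LFM2 support and the LiquidAI/LFM2-1.2B checkpoint; swap in the model size you need.

```python
# Minimal sketch: chat with an LFM2 checkpoint via Hugging Face transformers.
# Assumes a recent transformers release with LFM2 support; adjust the model ID as needed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LiquidAI/LFM2-1.2B"  # pick the size you need
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="bfloat16", device_map="auto")

messages = [{"role": "user", "content": "Summarize the key points of this meeting note: ..."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```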

| Model | Technique | Notebook |
|---|---|---|
| LFM2-8B-A1B | Supervised Fine-Tuning (TRL) | Open In Colab |
| LFM2-8B-A1B | Direct Preference Optimization (TRL) | Open In Colab |
| LFM2-2.6B | Supervised Fine-Tuning (TRL) | Open In Colab |
| LFM2-2.6B | Supervised Fine-Tuning (Axolotl) | Open In Colab |
| LFM2-2.6B | Supervised Fine-Tuning (Unsloth) | Open In Colab |
| LFM2-2.6B | Direct Preference Optimization (TRL) | Open In Colab |
| LFM2-1.2B | Supervised Fine-Tuning (TRL) | Open In Colab |
| LFM2-1.2B | Supervised Fine-Tuning (Axolotl) | Open In Colab |
| LFM2-1.2B | Supervised Fine-Tuning (Unsloth) | Open In Colab |
| LFM2-1.2B | Direct Preference Optimization (TRL) | Open In Colab |
| LFM2-700M | Supervised Fine-Tuning (TRL) | Open In Colab |
| LFM2-700M | Supervised Fine-Tuning (Axolotl) | Open In Colab |
| LFM2-700M | Supervised Fine-Tuning (Unsloth) | Open In Colab |
| LFM2-700M | Direct Preference Optimization (TRL) | Open In Colab |
| LFM2-350M | Supervised Fine-Tuning (TRL) | Open In Colab |
| LFM2-350M | Supervised Fine-Tuning (Axolotl) | Open In Colab |
| LFM2-350M | Supervised Fine-Tuning (Unsloth) | Open In Colab |
| LFM2-350M | Direct Preference Optimization (TRL) | Open In Colab |
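
If you prefer a script over the notebooks above, supervised fine-tuning with TRL follows the usual SFTTrainer pattern. The snippet below is a minimal sketch, assuming TRL is installed and a chat-formatted dataset such as the hypothetical your-org/your-chat-dataset; the Colab notebooks above remain the full, tested recipes, and the same layout applies to DPOTrainer for preference optimization.

```python
# Minimal SFT sketch with TRL; the linked notebooks are the reference recipes.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Assumption: a dataset with a "messages" column in the chat format TRL expects.
dataset = load_dataset("your-org/your-chat-dataset", split="train")  # hypothetical dataset ID

config = SFTConfig(
    output_dir="lfm2-1.2b-sft",
    per_device_train_batch_size=2,
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    num_train_epochs=1,
    bf16=True,
)

trainer = SFTTrainer(
    model="LiquidAI/LFM2-1.2B",  # one of the model IDs from the table above
    args=config,
    train_dataset=dataset,
)
trainer.train()
```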

Need a model for data extraction, RAG, tool use, or math reasoning? Start with our Nano checkpoints—they're already specialized for these tasks.

| Model | Use Cases |
|---|---|
| LFM2-1.2B-Extract<br>LFM2-350M-Extract | • Extracting invoice details from emails into structured JSON<br>• Converting regulatory filings into XML for compliance systems<br>• Transforming customer support tickets into YAML for analytics pipelines<br>• Populating knowledge graphs with entities and attributes from unstructured reports |
| LFM2-1.2B-RAG | • Chatbots that answer questions about the documentation of a particular product<br>• Customer support grounded in an internal knowledge base<br>• Academic research assistants for multi-turn conversations about research papers and course materials |
| LFM2-1.2B-Tool | • Mobile and edge devices requiring instant API calls, database queries, or system integrations without a cloud dependency<br>• Real-time assistants in cars, IoT devices, or customer support, where response latency is critical<br>• Resource-constrained environments such as embedded systems or battery-powered devices that need efficient tool execution |
| LFM2-350M-Math | • Mathematical problem solving<br>• Reasoning tasks |
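
As an illustration of the Extract checkpoints, the sketch below prompts LFM2-1.2B-Extract to turn a short email into JSON. It is a minimal example assuming the checkpoint uses the standard LFM2 chat template and a recent transformers release; the model card documents the exact prompting conventions.

```python
# Minimal sketch: structured extraction with LFM2-1.2B-Extract.
# Assumes the standard LFM2 chat template; see the model card for exact prompting.
from transformers import pipeline

extractor = pipeline("text-generation", model="LiquidAI/LFM2-1.2B-Extract")

email = (
    "Hi team, invoice #4821 from Acme Corp for $1,250.00 is due on 2025-03-15. "
    "Please confirm receipt."
)
messages = [
    {"role": "user", "content": f"Extract the invoice number, vendor, amount, and due date as JSON:\n{email}"}
]

result = extractor(messages, max_new_tokens=128)
# The pipeline returns the full conversation; the last message is the model's reply.
print(result[0]["generated_text"][-1]["content"])
```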

Note

The supported languages for these models are: English, Arabic, Chinese, French, German, Japanese, Korean, Portuguese, and Spanish.

Need support for another language?

Join the Liquid AI Discord Community and request it! Our community is working on expanding language support, and your input helps us prioritize which languages to tackle next. Connect with fellow developers, share your use cases, and collaborate on multilingual AI solutions.

Join Discord

LFM2-VL (Text+Image to Text)

LFM2-VL is our first series of vision-language models, designed for on-device deployment.

| Model | Technique | Notebook |
|---|---|---|
| LFM2-VL-1.6B | Supervised Fine-Tuning (TRL) | Open In Colab |
| LFM2-VL-450M | Supervised Fine-Tuning (TRL) | Open In Colab |
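
Before fine-tuning, the VL checkpoints can also be tried directly. The sketch below is a minimal example assuming LFM2-VL is available through AutoModelForImageTextToText in a recent transformers release and that the processor accepts image URLs in chat messages; the image URL is a hypothetical placeholder, and the model card lists the exact requirements.

```python
# Minimal sketch: image + text prompt with LFM2-VL.
# Assumes LFM2-VL support via AutoModelForImageTextToText in a recent transformers release.
from transformers import AutoProcessor, AutoModelForImageTextToText

model_id = "LiquidAI/LFM2-VL-450M"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(model_id, device_map="auto")

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/photo.jpg"},  # hypothetical image URL
            {"type": "text", "text": "Describe this image in one sentence."},
        ],
    }
]

inputs = processor.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=True,
    return_dict=True, return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=64)
print(processor.batch_decode(outputs, skip_special_tokens=True)[0])
```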

Deploy to an edge device

The LEAP Edge SDK is our native framework for running LFM2 models on mobile devices.

Written for Android (Kotlin) and iOS (Swift), the Edge SDK aims to make Small Language Model deployment as easy as calling a cloud LLM API endpoint for any app developer.

| Platform | Example |
|---|---|
| Android | LeapChat: a simple chat-style app that lets users chat with the model. ▶️ Go to the code |
| Android | SloganApp: single-turn generation for marketing, with a UI implemented in Android Views. ▶️ Go to the code |
| Android | ShareAI: a website summary generator. ▶️ Go to the code |
| Android | Recipe Generator: structured output generation with the LEAP SDK. ▶️ Go to the code |
| Android | Visual Language Model example. ▶️ Go to the code |
| iOS | LeapChat: a comprehensive chat application demonstrating advanced LeapSDK features, including real-time streaming, conversation management, and modern UI components. ▶️ Go to the code |
| iOS | LeapSloganExample: a simple SwiftUI app demonstrating basic LeapSDK integration for text generation. ▶️ Go to the code |
| iOS | Recipe Generator: structured output generation. ▶️ Go to the code |
| iOS | Audio demo: a SwiftUI app demonstrating audio input and output with the LeapSDK for on-device AI inference. ▶️ Go to the code |

End-2-end Tutorials

Complete end-to-end tutorials that take you from setup to deployment.

| Tutorial | Repository |
|---|---|
| Super fast and accurate image classification on edge devices | ▶️ Go to the repo |
| Let's build a Chess game using small and local Large Language Models | ▶️ Go to the repo |

Examples built by our community

Working applications that demonstrate Liquid models in action.

| Project | Repository |
|---|---|
| TranslatorLens: Building An Offline Translation Camera | ▶️ Go to the repo |

Contributing

We welcome contributions!

  • Open a PR with a link to your project's GitHub repo in the Examples built by our community section.

Support

For questions and support, join the Liquid AI Discord Community.