Starred repositories
Example Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using Amazon SageMaker.
Democratizing Reinforcement Learning for LLMs
Fair-code workflow automation platform with native AI capabilities. Combine visual building with custom code, self-host or cloud, 400+ integrations.
Witness the "aha moment" of a VLM for less than $3.
ZenML: The bridge between ML and Ops. https://zenml.io.
An implementation of "Attention Is All You Need".
A course on aligning smol models.
OpenHands: Code Less, Make More
Notes from the Latent Space paper club. Follow along or start your own!
This repo contains the code for the three demos I presented at Google Gemma2 DevDay Tokyo, running Gemma2 on a Jetson Orin Nano device.
The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, a no-code agent builder, and more.
Build Multimodal AI Agents with memory, knowledge and tools. Simple, fast and model-agnostic.
[ICLR 2023] ReAct: Synergizing Reasoning and Acting in Language Models
Langflow is a low-code app builder for RAG and multi-agent AI applications. It's Python-based and agnostic to any model, API, or database.
Examples and guides for using the Gemini API
Finetune Llama 3.3, DeepSeek-R1 & Reasoning LLMs 2x faster with 70% less memory!
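The speedups come from fused kernels plus 4-bit LoRA finetuning. A minimal sketch of the setup, assuming the `unsloth` package; the model id and LoRA rank below are illustrative assumptions, not project defaults:

```python
from unsloth import FastLanguageModel

# Load a 4-bit quantized base model (illustrative model id, assumed to exist on the Hub).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/Llama-3.2-3B-Instruct-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters so only a small fraction of the weights are trained.
model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)
```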
Sample code and notebooks for Generative AI on Google Cloud, with Gemini on Vertex AI
https://huyenchip.com/ml-interviews-book/
Curated list of data science interview questions and answers
Distributed Asynchronous Hyperparameter Optimization in Python
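Hyperopt's core entry point is `fmin`, which minimizes an objective over a search space defined with `hp` expressions. A minimal single-machine sketch; the toy objective and bounds are invented for illustration (distributed runs add a trials backend on top of the same call):

```python
from hyperopt import fmin, tpe, hp

# Toy objective: minimize (x - 3)^2 over a uniform search space.
best = fmin(
    fn=lambda x: (x - 3) ** 2,
    space=hp.uniform("x", -10, 10),
    algo=tpe.suggest,   # Tree-structured Parzen Estimator
    max_evals=100,
)
print(best)  # e.g. {'x': 2.99...}
```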
A curated list of Large Language Model resources, covering model training, serving, fine-tuning, and building LLM applications.
This repository contains demos I made with the Transformers library by HuggingFace.
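Many such demos build on the library's `pipeline` API; a minimal sketch of the pattern, with the task and input text chosen here purely for illustration:

```python
from transformers import pipeline

# Downloads a default pretrained checkpoint for the task on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("These demo notebooks make the library easy to pick up."))
# [{'label': 'POSITIVE', 'score': 0.99...}]
```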
Simple, unified interface to multiple Generative AI providers
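The usual pattern for such a unified client is to address models as "provider:model" strings behind one OpenAI-style interface. A rough sketch assuming the aisuite package; the client API shape and model ids are assumptions, not verified defaults:

```python
import aisuite as ai

client = ai.Client()  # provider API keys are read from the environment
messages = [{"role": "user", "content": "Summarize LoRA in one sentence."}]

# The same call shape works across providers; only the "provider:model" string changes.
for model in ["openai:gpt-4o-mini", "anthropic:claude-3-5-sonnet-20240620"]:
    response = client.chat.completions.create(model=model, messages=messages)
    print(model, "->", response.choices[0].message.content)
```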
Python Fire is a library for automatically generating command line interfaces (CLIs) from absolutely any Python object.
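Fire inspects an object's signature and maps its arguments to CLI flags; a minimal sketch, where the `greet` function and its flags are invented for illustration:

```python
import fire

def greet(name="World", shout=False):
    """Fire turns these keyword arguments into --name and --shout flags."""
    message = f"Hello, {name}!"
    return message.upper() if shout else message

if __name__ == "__main__":
    fire.Fire(greet)  # e.g. python greet.py --name=Ada --shout
```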
This is a repo with links to everything you'd ever want to learn about data engineering
Learn for free how to build an end-to-end production-ready LLM & RAG system using LLMOps best practices: source code + 12 hands-on lessons