Collection of Jupyter Notebooks related to Generative AI.
Updated Feb 13, 2024 - Jupyter Notebook
Code refactoring using large language models in Jupyter notebooks
Prompts, notebooks, and tools for generative pre-trained transformers.
The repository contains the notebook as well as Python implementations of language models, from basic naive implementations to Transformers, on a simple name dataset. More notebooks and files will be added regularly.
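The naive end of such a progression can be sketched as a character-level bigram model: count which character follows which in a list of names, then sample new names from those counts. This is a minimal illustration with a made-up name list, not the repository's actual code or dataset.

```python
import random
from collections import defaultdict

# Toy name dataset; the repository's real dataset is not specified here.
names = ["emma", "olivia", "ava", "isabella", "sophia", "mia", "amelia"]

# Count character bigrams, using "." as a start/end-of-name marker.
counts = defaultdict(lambda: defaultdict(int))
for name in names:
    chars = ["."] + list(name) + ["."]
    for a, b in zip(chars, chars[1:]):
        counts[a][b] += 1

def sample_name(rng):
    """Sample a name by walking the bigram counts until the end marker."""
    out, ch = [], "."
    while True:
        nexts = counts[ch]
        chars = list(nexts)
        weights = [nexts[c] for c in chars]
        ch = rng.choices(chars, weights=weights)[0]
        if ch == ".":
            return "".join(out)
        out.append(ch)

print(sample_name(random.Random(0)))
```

Replacing the count table with learned parameters, then adding context windows and attention, is the usual path from this naive model toward a Transformer.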
This Python notebook lets you fine-tune OpenAI's GPT models with your own data.
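Fine-tuning through OpenAI's API starts from training data in a JSONL file of chat-format records. The sketch below builds such a file from a hypothetical list of prompt/completion pairs; the example data and the `to_chat_record` helper are illustrative, not taken from the notebook.

```python
import json

# Hypothetical training pairs; replace with your own data.
examples = [
    {"prompt": "What is a transformer?",
     "completion": "A neural network architecture based on attention."},
]

def to_chat_record(ex):
    """Wrap a prompt/completion pair in the chat-message JSONL shape."""
    return {"messages": [
        {"role": "user", "content": ex["prompt"]},
        {"role": "assistant", "content": ex["completion"]},
    ]}

lines = [json.dumps(to_chat_record(ex)) for ex in examples]
with open("train.jsonl", "w") as f:
    f.write("\n".join(lines) + "\n")
```

The resulting file is then uploaded and a fine-tuning job created via the OpenAI client; consult the notebook or OpenAI's fine-tuning guide for the exact API calls.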
Implementing neural networks from scratch for a deeper understanding of concepts, featuring a Jupyter notebook with derivative-based implementations.
Python notebook for helping inspire and generate ideas.
QA with Jupyter Notebooks (.ipynb), powered by LangChain & Anthropic
Proof of concept to effortlessly refactor code, generate basic documentation, and create tests using GPT-4
A miniGPT inspired by Andrej Karpathy's nanoGPT. This notebook walks through the decoder part of the transformer architecture with the details outlined.
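The core of a decoder block is causal (masked) self-attention: each position may attend only to itself and earlier positions. A minimal single-head NumPy sketch, not the notebook's actual code:

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product attention with a causal mask."""
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)
    # Mask out future positions so position t only sees positions <= t.
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf
    # Row-wise softmax over the visible positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)
```

A full decoder block adds multiple heads, a feed-forward layer, residual connections, and layer normalization around this core.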
This repository showcases the implementation of a chat model using Google Cloud's Vertex AI. The code is presented in a Jupyter Notebook environment and demonstrates how to set up, interact with, and customize a pre-trained chat model.
Image captioning using ViT and GPT. A notebook version is available at the following link.
This collection of notebooks is based on the Dive into Deep Learning book. All of the notes are written in PyTorch and the d2l/torch library.
Run Dolly, the world’s first truly open instruction-tuned LLM, with your own prompts on IPUs
QuillGPT is a PyTorch implementation of the GPT decoder block based on the architecture from the "Attention Is All You Need" paper by Vaswani et al. Additionally, this repository contains two pre-trained models — Shakespearean GPT and Harpoon GPT — a Streamlit playground, a containerized FastAPI microservice, and training and inference scripts and notebooks.
A notebook that runs GPT-Neo with low VRAM (6 GB) and CUDA acceleration by loading it into GPU memory in smaller parts.
Notebooks on using transformers for sequential recommendation tasks
Codes, scripts, and notebooks on various aspects of transformer models.
Whisper2Summarize is an application that uses Whisper for audio processing and GPT for summarization. It generates summaries of audio transcripts quickly and accurately, making it ideal for a variety of use cases such as note-taking, research, and content creation.