SMILES GPT (SGPT) for chemical design through reinforcement learning
Updated Jun 17, 2024 - Jupyter Notebook
mychatgpt is a small, useful Python package that provides utilities for creating OpenAI GPT conversational agents. The module lets users hold interactive chats with GPT models and keeps track of the chat history. Useful as a Copilot-style agent in Python projects.
This repository presents an implementation of a Generative Pre-trained Transformer (GPT) model, comparing the performance and effectiveness of Kolmogorov-Arnold Network (KAN) layers against traditional multilayer perceptron (MLP) layers within the architecture.
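The core idea behind the KAN-vs-MLP comparison is that a KAN replaces each fixed scalar weight with a learnable univariate function on every edge. A minimal sketch of one such layer is below; the function names (`kan_layer`) and the use of Gaussian basis functions instead of the B-splines used in the original KAN formulation are illustrative assumptions, not taken from this repository.

```python
import numpy as np

def kan_layer(x, centers, coeffs):
    """Minimal KAN-style layer (sketch, not the repo's implementation).

    Each edge (i, j) carries a learnable univariate function
    phi_ij(x_i) = sum_k coeffs[i, j, k] * B_k(x_i), and output j is
    the sum of phi_ij(x_i) over all inputs i. B_k are Gaussian bumps
    here; the KAN paper uses splines.

    x:       (d_in,)            input vector
    centers: (n_basis,)         basis-function centers
    coeffs:  (d_in, d_out, n_basis) per-edge basis coefficients
    """
    basis = np.exp(-((x[:, None] - centers) ** 2) / 2)  # (d_in, n_basis)
    return np.einsum('ik,ijk->j', basis, coeffs)        # (d_out,)

rng = np.random.default_rng(0)
x = rng.standard_normal(3)               # d_in = 3
centers = np.linspace(-2, 2, 4)          # n_basis = 4
coeffs = rng.standard_normal((3, 2, 4))  # d_in x d_out x n_basis
y = kan_layer(x, centers, coeffs)        # shape (2,)
```

By contrast, an MLP layer computes a fixed nonlinearity after a linear map (`act(W @ x)`), so the comparison in the repository is essentially between learnable per-edge functions and learnable linear weights with a shared activation.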
nanoGPT model from scratch
This repository serves as a collection of various artificial intelligence projects or experiments I have done, whether academic or personal. See README for more information.
ToyGPT, inspired by Andrej Karpathy's GPT-from-scratch tutorial, builds a toy generative pre-trained transformer at its most basic level, using a simple bigram language model with attention to teach the basics of creating an LLM from scratch.
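The bigram language model that such tutorials start from can be sketched in a few lines: count which character follows which, then normalize the counts into next-character probabilities. The helper name `train_bigram` and the toy corpus are illustrative, not from ToyGPT itself.

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count character bigrams and normalize each row into a
    probability distribution over the next character."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    probs = {}
    for a, ctr in counts.items():
        total = sum(ctr.values())
        probs[a] = {b: c / total for b, c in ctr.items()}
    return probs

probs = train_bigram("hello world, hello there")
# probs['h'] is the distribution over characters that follow 'h'
```

A full GPT replaces this fixed lookup table with a transformer that conditions on the whole preceding context via attention, but sampling from the learned next-token distribution works the same way.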
Christopher, a software simulator of a ChatGPT-style "Generative Pretrained Transformer"
Implementation of the decoder part of the transformer architecture for a generative pre-trained transformer (GPT)
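The defining feature of the decoder side is causal (masked) self-attention: each position may attend only to itself and earlier positions. A minimal single-head sketch in NumPy is below; the function name and random projection matrices are illustrative assumptions, not this repository's code.

```python
import numpy as np

def causal_self_attention(x):
    """Single-head causal self-attention (decoder-style, sketch).

    x: (T, d) sequence of token embeddings.
    Returns the attended output (T, d) and the attention weights (T, T),
    which are lower-triangular because future positions are masked out.
    """
    T, d = x.shape
    rng = np.random.default_rng(0)
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d)                      # scaled dot-product
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)   # strictly-future cells
    scores[mask] = -np.inf                             # forbid looking ahead
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ v, weights

x = np.random.default_rng(1).standard_normal((5, 8))
out, w = causal_self_attention(x)
```

In a full GPT block this is followed by a residual connection, layer normalization, and a position-wise feed-forward network, stacked N times.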
Implementations of GPT, decoder blocks, LSTMs, LoRA, layer and batch normalization, LLMs, feed-forward neural networks (FFNNs), the attention mechanism, and transformers