A PyTorch library for all things Reinforcement Learning (RL) for Combinatorial Optimization (CO)
🚀🚀🚀 A collection of some awesome public YOLO object detection series projects.
PyTorch implementations of various token mixers (attention mechanisms, MLPs, etc.) for understanding computer vision papers and other tasks.
PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference
Generative Pre-trained Transformer in PyTorch
QuillGPT is a PyTorch implementation of the GPT decoder block based on the architecture from the "Attention Is All You Need" paper by Vaswani et al. Additionally, this repository contains two pre-trained models (Shakespearean GPT and Harpoon GPT), a Streamlit playground, a containerized FastAPI microservice, and training and inference scripts & notebooks.
A compilation of the best multi-agent papers
Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024)
Builds a solid foundational understanding of XAI, emphasizing how XAI methodologies can expose latent biases in datasets and reveal valuable insights.
Visualizing the attention of vision-language models
Unified-modal Salient Object Detection via Adaptive Prompt Learning
Alignment-Free RGBT Salient Object Detection: Semantics-guided Asymmetric Correlation Network and A Unified Benchmark
A collection of memory efficient attention operators implemented in the Triton language.
[ICML 2024] Outlier-Efficient Hopfield Layers for Large Transformer-Based Models
A Baby Llama model
Julia Implementation of Transformer models
Attention-based adaptive filter design for keyword classification
Official implementation of our paper "Diving Deep into Regions: Exploiting Regional Information Transformer for Single Image Deraining."
Official Implementation of SEA: Sparse Linear Attention with Estimated Attention Mask (ICLR 2024)