Generative Pre-trained Transformer in PyTorch (Python, updated Jun 11, 2024)
A PyTorch library for all things Reinforcement Learning (RL) for Combinatorial Optimization (CO)
PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference
Scenic: A Jax Library for Computer Vision Research and Beyond
PyTorch implementations of various token mixers (attention mechanisms, MLPs, etc.) for understanding computer vision papers and other tasks.
Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024)
Alignment-Free RGBT Salient Object Detection: Semantics-guided Asymmetric Correlation Network and A Unified Benchmark
A collection of memory efficient attention operators implemented in the Triton language.
[ICML 2024] Outlier-Efficient Hopfield Layers for Large Transformer-Based Models
Attention-based adaptive filter design for keyword classification
Official implementation of our paper "Diving Deep into Regions: Exploiting Regional Information Transformer for Single Image Deraining."
Official Implementation of SEA: Sparse Linear Attention with Estimated Attention Mask (ICLR 2024)
Keras beit, caformer, CMT, CoAtNet, convnext, davit, dino, efficientdet, edgenext, efficientformer, efficientnet, eva, fasternet, fastervit, fastvit, flexivit, gcvit, ghostnet, gpvit, hornet, hiera, iformer, inceptionnext, lcnet, levit, maxvit, mobilevit, moganet, nat, nfnets, pvt, swin, tinynet, tinyvit, uniformer, volo, vanillanet, yolor, yolov7, yolov8, yolox, gpt2, llama2, alias kecam
[ICLR 2024] AGILE3D: Attention Guided Interactive Multi-object 3D Segmentation
A Jax-based library for designing and training transformer models from scratch.
Project Name: AdaViT | PyTorch Lightning, Python
The official repo for [IJCAI'24] "LeMeViT: Efficient Vision Transformer with Learnable Meta Tokens for Remote Sensing Image Interpretation"
Investigate possibilities for Vision Transformers with multiscale grids
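The repositories above all build on one core primitive: scaled dot-product attention. For orientation, here is a minimal NumPy sketch of that primitive (illustrative only, not code from any listed repository; the function and variable names are our own):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(q, k, v):
    """q: (n, d) queries, k: (m, d) keys, v: (m, dv) values.

    Returns the attended output (n, dv) and the attention weights (n, m).
    """
    d = q.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d).
    scores = q @ k.T / np.sqrt(d)
    # Each query's weights over the keys sum to 1.
    weights = softmax(scores, axis=-1)
    # Output is a weighted average of the values.
    return weights @ v, weights

rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
k = rng.standard_normal((6, 8))
v = rng.standard_normal((6, 8))
out, w = scaled_dot_product_attention(q, k, v)
```

Many of the listed projects (memory-efficient Triton operators, sparse linear attention, constant-cost softmax attention) are optimizations or approximations of exactly this computation.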