Tensorflow implementation of a 3D-CNN U-net with Grid Attention and DSV for pancreas segmentation trained on CT-82.
PyTorch implementation of the models RT-1-X and RT-2-X from the paper: "Open X-Embodiment: Robotic Learning Datasets and RT-X Models"
[CVPR 2024] "CFAT: Unleashing Triangular Windows for Image Super-resolution"
DeepFill v1/v2 with Contextual Attention and Gated Convolution, CVPR 2018, and ICCV 2019 Oral
A PyTorch library for all things Reinforcement Learning (RL) for Combinatorial Optimization (CO)
Implementation of the model "AudioFlamingo" from the paper: "Audio Flamingo: A Novel Audio Language Model with Few-Shot Learning and Dialogue Abilities"
Implementation of SelfExtend from the paper "LLM Maybe LongLM: Self-Extend LLM Context Window Without Tuning" in PyTorch and Zeta
Hyperspectral Unmixing via Dual Attention Convolutional Neural Networks
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024)
[ABAW6 (CVPR-W)] Second place in the valence-arousal challenge of ABAW6
[FG 2024] "Audio-Visual Person Verification based on Recursive Fusion of Joint Cross-Attention"
Built specifically for the research proposal "Estimating a sector attention index with deep learning methods: the Chinese stock market as an example", Jan. 4, 2024.
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
Camouflaged Object Detection, CVPR 2020 (Oral)
Neural Machine Translator with Transformers. Implementation for "Attention Is All You Need" paper
A tiny version of GPT, built with PyTorch and trained on Shakespeare
Code to reproduce results of the Gigapixel Histopathological Image Analysis Using Attention-Based Neural Networks paper.
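Several of the repositories above implement variants of the Transformer's core operation from "Attention Is All You Need". As background, here is a minimal NumPy sketch of scaled dot-product attention; it is illustrative only and is not taken from any of the listed repositories:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Q, K: arrays of shape (seq_len, d_k); V: array of shape (seq_len, d_v).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # pairwise similarity scores
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # weighted sum of values

# Toy example: 4 tokens with model dimension 8
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Real implementations (e.g. the Transformer NMT and GPT repos listed here) add multiple heads, learned projection matrices, and masking on top of this same core computation.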