Extend/Passing extra source tokens to seq2seq encoder (PyTorch)
Updated Jan 4, 2018, Python
TensorFlow implementation of "Pointer Networks"
Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"
Question-answering bot created using the Gunthercox corpus
Fake news detection on the LIAR-PLUS dataset using both traditional machine learning and deep learning techniques. The deep learning models include a plain LSTM network and a contextual-attention network (with justification).
Repository containing the code to my bachelor thesis about Neural Machine Translation
A collection of code for training image captioning models with PyTorch. Model architectures are based on the Show, Attend and Tell paper, with different implementations of the attention component.
Implementation of various channel-wise attention modules
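Channel-wise attention modules like the ones collected above typically follow the squeeze-and-excitation pattern: global-average-pool each channel, pass the pooled vector through a small bottleneck MLP, and rescale the channels with the resulting sigmoid gates. A minimal NumPy sketch of that pattern (the array shapes, weight names, and reduction ratio are illustrative assumptions, not any specific repository's API):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    """SE-style channel attention for a (C, H, W) feature map.

    x  : (C, H, W) input features
    w1 : (C//r, C) squeeze weights (bottleneck, reduction ratio r)
    w2 : (C, C//r) excitation weights
    """
    pooled = x.mean(axis=(1, 2))           # global average pool -> (C,)
    hidden = np.maximum(w1 @ pooled, 0.0)  # bottleneck + ReLU -> (C//r,)
    gates = sigmoid(w2 @ hidden)           # per-channel gates in (0, 1)
    return x * gates[:, None, None]        # rescale each channel

# toy usage: 8 channels, reduction ratio r = 4 (hypothetical sizes)
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 5, 5))
w1 = rng.standard_normal((2, 8)) * 0.1
w2 = rng.standard_normal((8, 2)) * 0.1
y = channel_attention(x, w1, w2)
```

Because each gate lies strictly between 0 and 1, the module can only attenuate channels, never amplify them; learned variants (BAM, CBAM) add spatial branches on top of this channel branch.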
Reproducibility Challenge 2020 papers
Attention-based video classifier running on accelerated attention approximations
Plot boundary delineation using a collection of U-Net models
Official implementation of the paper: Disagreement attention: Let us agree to disagree on computed tomography segmentation
Implementation of Deep Vision Transformer in Flax
Exploring an idea where one forgets about efficiency and carries out attention across each edge of the nodes (tokens)
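Attending across every edge between tokens is, at its core, full quadratic self-attention: an n x n score matrix with one entry per token pair, no sparsity or approximation. A minimal NumPy sketch of scaled dot-product attention in that spirit (function and variable names are illustrative, not taken from the repository above):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def full_attention(q, k, v):
    """Scaled dot-product attention over every token pair (edge).

    q, k, v : (n, d) query/key/value matrices for n tokens.
    Returns : (n, d) outputs; the score matrix covers all n*n edges.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)       # (n, n): one score per edge
    weights = softmax(scores, axis=-1)  # rows are convex combinations
    return weights @ v

# toy usage: 6 tokens with 4-dimensional heads
rng = np.random.default_rng(1)
n, d = 6, 4
q, k, v = (rng.standard_normal((n, d)) for _ in range(3))
out = full_attention(q, k, v)
```

Since each output row is a convex combination of the value rows, every output coordinate stays within the range spanned by the values; the cost of materializing all edges is the O(n^2) memory that efficient-attention variants try to avoid.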
Gated Attention Unit (TensorFlow implementation)
Implementation of HydraPlusNet (HP-Net) with Weighted Loss for Pedestrian Analysis
Designed a sequence transduction model based on recurrent and convolutional neural networks and an attention mechanism
Deep learning approach to pixel-wise classification: the ultrasonic image is passed through the U-Net encoder, a combination of convolutional neural networks (CNNs) and an attention map focuses on the significant region of the image, and the result is passed through the U-Net decoder.
Assessing the writing skill of a document/author by classifying its statements and sentences into different classes via sequential learning with pre-trained BERT models, then rating the document based on the scores obtained for each statement/sentence.