Extend/Passing extra source tokens to seq2seq encoder (PyTorch)
Official PyTorch code for "BAM: Bottleneck Attention Module (BMVC2018)" and "CBAM: Convolutional Block Attention Module (ECCV2018)"
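For readers unfamiliar with CBAM, a minimal PyTorch sketch of its two attention blocks follows. This is an illustrative reconstruction from the paper's description, not code from the official repo; class names and defaults (e.g. reduction=16, a 7x7 spatial kernel) are assumptions.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    # Squeeze spatial dims with avg- and max-pooling, share one MLP, gate channels.
    def __init__(self, channels, reduction=16):  # reduction ratio is an assumption
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # (B, C) from average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # (B, C) from max pooling
        return x * torch.sigmoid(avg + mx).view(b, c, 1, 1)

class SpatialAttention(nn.Module):
    # Pool across channels, then learn a per-pixel gate with a 7x7 conv.
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        pooled = torch.cat([x.mean(dim=1, keepdim=True),
                            x.amax(dim=1, keepdim=True)], dim=1)  # (B, 2, H, W)
        return x * torch.sigmoid(self.conv(pooled))

# CBAM applies channel attention, then spatial attention, in sequence.
x = torch.randn(2, 64, 32, 32)
out = SpatialAttention()(ChannelAttention(64)(x))
print(out.shape)  # torch.Size([2, 64, 32, 32])
```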
Question-answering bot created using the Gunthercox corpus
Fake news detection on the LIAR-PLUS dataset using both traditional machine learning and deep learning techniques. The deep learning approaches include a plain LSTM network and contextual attention (with justification).
Code for training image captioning models with PyTorch. Model architectures are based on the Show, Attend & Tell paper, with different implementations of the attention component.
Reproducibility Challenge 2020 papers
Official implementation used in the paper "Disagreement attention: Let us agree to disagree on computed tomography segmentation"
A comprehensive implementation of the Transformer architecture from scratch
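As a reference point for what such from-scratch implementations typically contain, here is a minimal sketch of scaled dot-product attention, the Transformer's core operation. The function name and shapes are illustrative, not taken from the repository above.

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k); mask: 0 where attention is disallowed.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # (B, H, Lq, Lk)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)  # attention distribution over keys
    return weights @ v                       # (B, H, Lq, d_k)

q = k = v = torch.randn(1, 8, 10, 64)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([1, 8, 10, 64])
```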
Assessing the writing skills of a document/author by classifying its statements and sentences into different classes via sequential learning with pre-trained BERT models, then rating the document based on the class scores obtained for each statement/sentence.
Code and notes for Kaggle competitions (in progress...)
Neural machine translation with an attention mechanism
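Several of the seq2seq entries in this list rely on additive (Bahdanau-style) attention over encoder outputs. A minimal sketch of that scoring step follows; the class, dimension names, and sizes are illustrative assumptions, not code from any listed repository.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    # Bahdanau-style attention: score(s, h_i) = v^T tanh(W_s s + W_h h_i)
    def __init__(self, dec_dim, enc_dim, attn_dim):
        super().__init__()
        self.w_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.w_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state, enc_outputs):
        # dec_state: (B, dec_dim); enc_outputs: (B, L, enc_dim)
        scores = self.v(torch.tanh(
            self.w_dec(dec_state).unsqueeze(1) + self.w_enc(enc_outputs)
        )).squeeze(-1)                                  # (B, L)
        weights = torch.softmax(scores, dim=-1)         # attention over source tokens
        context = (weights.unsqueeze(1) @ enc_outputs).squeeze(1)  # (B, enc_dim)
        return context, weights

attn = AdditiveAttention(dec_dim=512, enc_dim=256, attn_dim=128)
context, weights = attn(torch.randn(4, 512), torch.randn(4, 20, 256))
print(context.shape, weights.shape)  # torch.Size([4, 256]) torch.Size([4, 20])
```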
Personal master's degree thesis project
LAWNet
A deep learning model for video super-resolution that applies temporal and spatial attention in cascade
Code for the short essay project for the MPhil TAL by Yingjia Wan in 2023.
Uses Keras to build an encoder-decoder model with an attention mechanism that generates a title from an abstract
Visual Question Answering (VQA) Model
This course examines the theoretical and applied problems of constructing and modelling systems that aim to extract and represent the meaning of natural language sentences or whole discourses, drawing on contributions from linguistics, cognitive psychology, artificial intelligence, and computer science.
Comparison of the Synthesizer versus the vanilla minGPT transformer on a question-answering task