Multilingual Automatic Speech Recognition with word-level timestamps and confidence
TF2 Deep FloorPlan Recognition using a Multi-task Network with Room-boundary-Guided Attention. Supports TensorBoard, quantization, Flask, TFLite, Docker, GitHub Actions, and Google Colab.
This repository contains various attention mechanisms, such as Bahdanau attention, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras.
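Several of the repositories above implement additive (Bahdanau) attention, where an alignment score v^T tanh(W_q q + W_k k_t) is computed between a query state and each key, normalized with a softmax, and used to form a weighted context vector. A minimal NumPy sketch of that scoring step (all variable names and dimensions here are illustrative, not taken from any of the listed repositories):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def additive_attention(query, keys, W_q, W_k, v):
    """Bahdanau-style additive attention.

    query: (d,) decoder state; keys: (T, d) encoder states.
    Scores are v^T tanh(W_q q + W_k k_t); weights are a softmax
    over the T timesteps; the context is the weighted sum of keys.
    """
    scores = np.tanh(keys @ W_k.T + query @ W_q.T) @ v  # (T,)
    weights = softmax(scores)                           # (T,)
    context = weights @ keys                            # (d,)
    return context, weights

# Toy example with made-up sizes: T=5 timesteps, d=4, hidden size h=8.
rng = np.random.default_rng(0)
T, d, h = 5, 4, 8
q = rng.normal(size=d)
K = rng.normal(size=(T, d))
ctx, w = additive_attention(q, K,
                            rng.normal(size=(h, d)),
                            rng.normal(size=(h, d)),
                            rng.normal(size=h))
```

The attention weights form a probability distribution over the encoder timesteps, which is what makes the mechanism interpretable as a soft alignment.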
PyTorch implementation of "Adaptive Co-attention Network for Named Entity Recognition in Tweets" (AAAI 2018)
[TMI 2019] Attention to Lesion: Lesion-Aware Convolutional Neural Network for Retinal Optical Coherence Tomography Image Classification
RSANet: Recurrent Slice-wise Attention Network for Multiple Sclerosis Lesion Segmentation (MICCAI 2019)
Speech recognition model for recognising Macedonian spoken language.
[CoRL 2023] Context-Aware Deep Reinforcement Learning for Autonomous Robotic Navigation in Unknown Areas - public code and model
Using attention network to extend image quality assessment algorithms for video quality assessment
Sequence-to-sequence with attention mechanisms in TensorFlow 2
This is the official source code of our IEA/AIE 2021 paper
Gated-ViGAT. Code and data for our paper: N. Gkalelis, D. Daskalakis, V. Mezaris, "Gated-ViGAT: Efficient bottom-up event recognition and explanation using a new frame selection policy and gating mechanism", IEEE International Symposium on Multimedia (ISM), Naples, Italy, Dec. 2022.
A TensorFlow 2.0 Implementation of the Transformer: Attention Is All You Need
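The Transformer implementation above is built around scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A self-contained NumPy sketch of that core operation (shapes and names are illustrative, not taken from the listed repository):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core Transformer op: softmax(Q K^T / sqrt(d_k)) V.

    Q: (T_q, d_k) queries; K: (T_k, d_k) keys; V: (T_k, d_v) values.
    Returns the attended output (T_q, d_v) and the weights (T_q, T_k).
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (T_q, T_k)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 3 query positions, 5 key/value positions.
rng = np.random.default_rng(1)
Q = rng.normal(size=(3, 8))   # d_k = 8
K = rng.normal(size=(5, 8))
V = rng.normal(size=(5, 16))  # d_v = 16
out, attn = scaled_dot_product_attention(Q, K, V)
```

The 1/sqrt(d_k) scaling keeps the dot products from growing with the key dimension, which would otherwise push the softmax into regions with vanishing gradients.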
High Dynamic Range Image Synthesis via Attention Non-Local Network
Python 3 supported version for DySAT
An implementation of Transformer Networks using Chainer
A customized version of the Relational Aware Graph Attention Network for large scale EHR records.
locality-aware invariant Point Attention-based RNA ScorEr
Efficient Visual Tracking with Stacked Channel-Spatial Attention Learning
Deep learning model for non-coding regulatory variants