Notebooks for running and visualizing results using trained models for linguistic complexity.
The basic notebook for implementing attention.
NLP study-note notebooks, covering the understanding of classic models and related hands-on practice.
In this notebook, we look at how attention is implemented. We will focus on implementing attention in isolation from a larger model.
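The core of attention in isolation is small enough to sketch directly. Below is a minimal, illustrative NumPy implementation of scaled dot-product attention (softmax(QK^T / sqrt(d_k)) V); the function name and toy shapes are assumptions for the example, not taken from any of the repositories above.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return softmax(Q K^T / sqrt(d_k)) V and the attention weights."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_queries, n_keys) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V, weights

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
# out has shape (2, 4); each row of w sums to 1.
```

The output is a weighted average of the value vectors, with weights given by the query-key similarities; this is the same building block that multi-head attention repeats across several projected subspaces.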
Snippets of NLP frameworks.
Notes on ML and DL with jupyter notebooks (python)
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, GPT.
The notebook explains the various steps to obtain the results of publication: "Is Space-Time Attention All You Need for Video Understanding?"
Assignments and lab notebooks of NLP Specialization by DeepLearning.ai
Deep learning research implemented on notebooks using PyTorch.
Exercise notebooks for CVND, the Udacity Computer Vision Nanodegree.
This collection of notebooks is based on the Dive into Deep Learning book. All of the notes are written in PyTorch and the d2l/torch library.
QuillGPT is a PyTorch implementation of the GPT decoder block based on the architecture from the "Attention Is All You Need" paper by Vaswani et al. Additionally, this repository contains two pre-trained models — Shakespearean GPT and Harpoon GPT — a Streamlit playground, a containerized FastAPI microservice, and training and inference scripts and notebooks.