
Transformers #30

Open
kitagrawal opened this issue Aug 26, 2019 · 7 comments
Labels
enhancement New feature or request

Comments

@kitagrawal

Maybe create a separate section on Transformers (in your to-do list). Recently, they have been getting a lot of attention.

@alvations

@ankit--agrawal "attention", nice pun =)

@kitagrawal
Author

:-P
There are some good blog posts on the topic but I will be particularly interested to know some good resources (lecture series) on this topic.

@kmario23
Owner

kmario23 commented Sep 5, 2019

Hey @ankit--agrawal,
thanks for your suggestion! Since this is a specialized topic, let's maintain it in this thread, at least for now.

Here is a preliminary list of lectures:

Please feel free to suggest any worthwhile lectures I've overlooked!

@kitagrawal
Author

Thank you so much for these. I will update the thread if I find something worthwhile. :)

@kmario23 kmario23 added the enhancement New feature or request label Nov 10, 2019
@georgezoto

Could you please make the Transformer list of lectures available on the main page?

Also how about adding content from the original attention papers:

  1. Neural Machine Translation by Jointly Learning to Align and Translate
    https://arxiv.org/pdf/1409.0473.pdf

  2. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
    https://arxiv.org/pdf/1502.03044.pdf
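For anyone landing on this thread, the mechanism these papers introduced (and that Transformers later built on) fits in a few lines. This is a minimal NumPy sketch of scaled dot-product attention from the Transformer paper, not the additive attention of the Bahdanau paper, and not code from this repo:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v). Returns (seq_len, d_v)."""
    d_k = Q.shape[-1]
    # similarity of each query to each key, scaled to keep softmax stable
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each output row is a weighted sum of the value vectors
    return weights @ V

# toy example: 3 tokens, model dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

The blog posts and lectures above cover the full multi-head, projected version; this is just the core weighted-sum idea.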

Thanks in advance, and let us know how we can help further,
George

@kmario23
Owner

Could you please make the Transformer list of lectures available on the main page?

This is a nice suggestion! I've been thinking of a neat way to add it on the main page.

Also how about adding content from the original attention papers:

  1. Neural Machine Translation by Jointly Learning to Align and Translate
    https://arxiv.org/pdf/1409.0473.pdf
  2. Show, Attend and Tell: Neural Image Caption Generation with Visual Attention
    https://arxiv.org/pdf/1502.03044.pdf

I'm unsure about this since adding papers is not the goal of this repo!

Contributions & suggestions are always welcome :)

@georgezoto

I understand; there are plenty of lecture series on Transformers and blog posts that break down the original paper. I respect the framework you have chosen, and I look forward to your neat idea.
