
https://eugeneyan.com/writing/attention/ #74

Open
utterances-bot opened this issue Aug 17, 2023 · 2 comments

Comments

@utterances-bot
Some Intuition on Attention and the Transformer

What's the big deal, intuition on query-key-value vectors, multiple heads, multiple layers, and more.

https://eugeneyan.com/writing/attention/
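For readers arriving from this thread, the query-key-value idea the post builds intuition for can be sketched as scaled dot-product attention. This is a minimal NumPy illustration, not code from the post: each query row is scored against every key, and the softmax of those scores blends the value rows into one output per query.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Score each query against every key, then use the softmax
    weights to mix the value rows into one output per query."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # query-key similarity, scaled
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))  # 2 queries, dimension 4
K = rng.normal(size=(3, 4))  # 3 keys
V = rng.normal(size=(3, 4))  # 3 values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one blended output per query: (2, 4)
```

Multi-head attention, also covered in the post, just runs several of these lookups in parallel on learned projections of Q, K, and V and concatenates the results.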


Wow, this was such an insightful dive into Transformers! I loved how you broke down the core concepts; your explanations really helped solidify my understanding.

The part about libraries and key-value pairs for queries was especially enlightening. I walked away feeling like I have a much stronger handle on how these models work now.

Thank you for putting this together! I would highly recommend your material to anyone looking to better understand Transformers. Keep up the great work!

@eugeneyan
Owner

Wow, thank you for the kind words! Your feedback encourages me to write more and to keep simplifying concepts like these 🙏
