
Replace mention of one-hot encodings with embedding explanation #878

Closed
wants to merge 3 commits into from

Conversation

mariomeissner

This tutorial says that we will encode the words as one-hot vectors, but that never happens anywhere in the code, since we use embedding layers instead. I removed the paragraph about one-hot encodings and replaced it with a short explanation of embeddings.

Let me know if I should rephrase it or leave out the embedding explanation altogether.

@netlify

netlify bot commented Mar 8, 2020

Deploy preview for pytorch-tutorials-preview failed.

Built with commit 24d3848

https://app.netlify.com/sites/pytorch-tutorials-preview/deploys/5ecc9e6c01d2110006de2ac2

Base automatically changed from master to main February 16, 2021 19:33
Base automatically changed from main to master February 16, 2021 19:37
Contributor

@zhangguanheng66 left a comment


It uses the regular vocabulary to convert tokens into indexes and feeds those indexes into the Embedding layer.

@mariomeissner
Author

Yes, indexes, but not in a one-hot-encoded representation.
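To illustrate the distinction being discussed: an embedding layer takes integer indexes and performs a direct row lookup, which is mathematically equivalent to multiplying a one-hot vector by the embedding matrix, but no one-hot vector is ever materialized. A minimal NumPy sketch (the vocabulary size, embedding dimension, and token index below are made up for illustration):

```python
import numpy as np

# Hypothetical toy setup: a vocabulary of 5 tokens, embedding dimension 3.
vocab_size, embed_dim = 5, 3
rng = np.random.default_rng(0)
weight = rng.standard_normal((vocab_size, embed_dim))  # embedding matrix

token_index = 2  # integer index produced by the vocabulary lookup

# What an embedding layer effectively does: a direct row lookup by index.
lookup = weight[token_index]

# The mathematically equivalent one-hot formulation (never built in practice).
one_hot = np.zeros(vocab_size)
one_hot[token_index] = 1.0
via_one_hot = one_hot @ weight

assert np.allclose(lookup, via_one_hot)
```

The two results are identical; the tutorial's code only ever handles the integer indexes, which is why the one-hot wording was misleading.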
