Reading: All-but-the-Top: Simple and Effective Postprocessing for Word Representations #51


0. Paper

```
@inproceedings{
mu2018allbutthetop,
title={All-but-the-Top: Simple and Effective Postprocessing for Word Representations},
author={Jiaqi Mu and Pramod Viswanath},
booktitle={International Conference on Learning Representations},
year={2018},
url={https://openreview.net/forum?id=HkuGJ3kCb},
}
```
[paper]

My code [GitHub]

1. What is it?

They propose a simple and effective post-processing method for word embeddings: subtract the common mean vector, then remove the projections onto the top principal components.

2. What is amazing compared to previous studies?

They note that many embedding methods exist and that their performance is approximately the same. Moreover, the authors point out two problems shared by PMI-based methods (word2vec, SPPMI-SVD); the sketch after this list illustrates both:

  • non-zero mean: across all words w, the vectors v(w) share a large common mean vector, so they are not centered at the origin.
  • not isotropic: the vectors are not spread uniformly in all directions; a few dominant directions carry most of the variance.
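
A quick, hypothetical check (my own, not from the paper) that makes both problems visible on any embedding matrix:

```python
import numpy as np

# V: (vocab_size, d) matrix of word vectors, e.g. loaded from word2vec/GloVe.
# Random stand-in here so the snippet runs on its own; the added 0.5
# mimics the non-zero-mean problem.
rng = np.random.default_rng(0)
V = rng.normal(size=(10000, 300)) + 0.5

# Problem 1: non-zero mean -- the common mean vector is large
# relative to the typical vector norm.
mu = V.mean(axis=0)
print("||mean|| / avg ||v(w)||:",
      np.linalg.norm(mu) / np.linalg.norm(V, axis=1).mean())

# Problem 2: anisotropy -- how much variance the top-5 directions carry.
# (Small for this random stand-in; reportedly much larger for real embeddings.)
s = np.linalg.svd(V - mu, compute_uv=False)
print("top-5 variance share:", (s[:5] ** 2).sum() / (s ** 2).sum())
```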

3. Where is the key to technologies and techniques?

[Screenshot: Algorithm 1, the proposed post-processing procedure]

D is a hyperparameter, set to approximately d/100, where d is the embedding dimension.
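
A minimal NumPy sketch of the procedure as I understand it (function and variable names are my own):

```python
import numpy as np

def all_but_the_top(V, D):
    """Post-process word vectors: subtract the mean, then remove
    the projections onto the top-D principal components.

    V: (vocab_size, d) embedding matrix.
    D: number of dominant directions to remove, roughly d // 100.
    """
    # 1. Center the vectors.
    mu = V.mean(axis=0)
    V_tilde = V - mu
    # 2. Principal directions of the centered vectors
    #    (rows of Vt are the principal axes).
    _, _, Vt = np.linalg.svd(V_tilde, full_matrices=False)
    U = Vt[:D]                            # (D, d)
    # 3. Remove the components along those directions.
    return V_tilde - V_tilde @ U.T @ U

# Example: d = 300 -> remove the top D = 3 components.
V = np.random.default_rng(0).normal(size=(10000, 300))
V_post = all_but_the_top(V, D=3)
```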

4. How did they validate it?

They define a measure of isotropy, I({v(w)}).

[Screenshots: definition of the isotropy measure I({v(w)}) and its approximation]
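
As far as I recall the paper, the measure is I({v(w)}) = min_c Z(c) / max_c Z(c) over unit vectors c, where Z(c) = Σ_w exp(c^T v(w)) is a partition function, and the min and max are approximated by letting c range over the eigenvectors of V^T V. A sketch of that approximation (my own code, not the authors'):

```python
import numpy as np

def isotropy(V):
    """Approximate I({v(w)}) = min_c Z(c) / max_c Z(c), with
    Z(c) = sum_w exp(c^T v(w)) and c ranging over the eigenvectors
    of V^T V, as (I believe) the paper does. Assumes vector norms
    are moderate so exp() does not overflow.
    """
    _, eigvecs = np.linalg.eigh(V.T @ V)   # columns are unit eigenvectors
    Z = np.exp(V @ eigvecs).sum(axis=0)    # Z(c) for each eigenvector c
    return Z.min() / Z.max()
```

A value close to 1 means the embeddings are nearly isotropic; a value close to 0 means a few directions dominate.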

Applied to word2vec and GloVe vectors, their method improved this isotropy measure.

[Screenshot: isotropy scores before and after post-processing]

Moreover, their method also improved performance on word embedding benchmarks.

5. Is there a discussion?

6. Which paper should I read next?
