Question about parameter in code #22

Closed

airship-explorer opened this issue Dec 5, 2017 · 2 comments

@airship-explorer commented Dec 5, 2017

I want to ask a question about the parameter shrink_embedding in train_w.py. If I don't use external resources, can this parameter be set to False without affecting the final result? Also, I can't fully understand the meaning of the other parameter, fine_tune; could you explain it to me? Thank you very much!

if args.fine_tune:              # which means does not do fine-tune  
        f_map = {'<eof>': 0}

I think this code may be wrong: if I set fine_tune=True, this block is executed, and then the pre-trained embedding dictionary can't be fine-tuned.

@LiyuanLucasLiu (Owner)

Yes, if you do not use pre-trained embeddings, enabling --shrink_embedding should not affect the final result.

If --fine_tune is enabled, the resulting word dictionary contains not only the words that have pre-trained embeddings, but also frequent words from the training corpus (which are not in the pre-trained embedding dictionary).

However, you are encouraged to use pre-trained embeddings for better performance.
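
As a rough illustration (a minimal sketch, not the actual code in train_w.py; build_word_map and min_count are hypothetical names), a combined dictionary of this kind could be built as follows:

    from collections import Counter

    def build_word_map(pretrained_vocab, corpus_tokens, min_count=5):
        # Hypothetical helper: start from the special token, add every word
        # that has a pre-trained embedding, then add frequent corpus words.
        word_map = {'<eof>': 0}  # special token, as in train_w.py
        for word in pretrained_vocab:
            if word not in word_map:
                word_map[word] = len(word_map)
        counts = Counter(corpus_tokens)
        for word, freq in counts.items():  # frequent words without embeddings
            if freq >= min_count and word not in word_map:
                word_map[word] = len(word_map)
        return word_map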

@airship-explorer (Author)

Thank you very much
