I want to ask a question about the parameter `shrink_embedding` in train_w.py. If I don't use external resources, can this parameter be set to False without affecting the final result? I also can't fully understand the meaning of the other parameter, `fine_tune`. Can you explain it for me? Thank you very much!
```python
if args.fine_tune:  # which means does not do fine-tune
    f_map = {'<eof>': 0}
```
I think this code may be wrong: if I choose fine_tune=True, this code will run, and then the pre-trained embedding dictionary can't be fine-tuned.
Yes, if you do not use pre-trained embeddings, enabling --shrink_embedding should not affect the final result.
If --fine_tune is enabled, the resulting word dictionary will contain not only words with pre-trained embeddings, but also frequent words from the training corpus (those not in the vocabulary of the pre-trained embeddings).
However, you are encouraged to use pre-trained embeddings for better performance.
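To illustrate the dictionary-building behavior described above, here is a minimal, hypothetical sketch (the function name, arguments, and threshold are assumptions, not the actual code in train_w.py): with fine-tuning enabled, the word map holds every word from the pre-trained embedding vocabulary plus frequent training-corpus words, whose vectors would be randomly initialized and learned during training.

```python
from collections import Counter

def build_word_map(pretrained_vocab, training_tokens, min_count=5, fine_tune=True):
    """Hypothetical sketch of word-dictionary construction.

    When fine_tune is True, the map contains every word with a
    pre-trained embedding plus frequent training-corpus words that
    are missing from the embedding vocabulary.
    """
    word_map = {'<eof>': 0}  # special token, as in the snippet quoted above
    # Words covered by the pre-trained embeddings always get an index.
    for w in pretrained_vocab:
        if w not in word_map:
            word_map[w] = len(word_map)
    if fine_tune:
        # Add frequent corpus words that lack a pre-trained vector;
        # their embeddings would be randomly initialized and trained.
        counts = Counter(training_tokens)
        for w, c in counts.items():
            if c >= min_count and w not in word_map:
                word_map[w] = len(word_map)
    return word_map
```

With fine_tune=False, only the pre-trained vocabulary (plus the special token) survives, which matches the intuition that the embedding table is kept fixed to the pre-trained dictionary.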