Randomly initialising word vectors #32
There doesn't seem to be an option to initialise word vectors without using pretrained embeddings. There's an option to fill in vectors for tokens missing from the pretrained embeddings with normally distributed values. It would be cool if there were a built-in option to initialise embeddings from a uniform distribution without having to specify a word embedding file.
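For reference, a minimal sketch of what such an option might do under the hood, assuming a PyTorch `nn.Embedding` layer (the vocabulary size, dimension, and uniform bounds here are all hypothetical):

```python
import torch.nn as nn

# Hypothetical vocabulary size and embedding dimension.
vocab_size, emb_dim = 1000, 100

# Build the embedding layer, then overwrite its weights in place with
# uniformly distributed values -- no pretrained embedding file required.
embedding = nn.Embedding(vocab_size, emb_dim)
nn.init.uniform_(embedding.weight, -0.25, 0.25)
```

The `(-0.25, 0.25)` bounds are just an illustration; any range could be exposed as a parameter.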
No, you don’t need to put them there -- the only place where your embeddings actually need to be is in your model; TEXT.vocab.vectors offers a way to get pretrained vectors corresponding to your vocabulary and then use those to initialize your model's embeddings.
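As a sketch of that last step, copying vectors into a model's embedding layer might look like the following (the random tensor here is a stand-in for `TEXT.vocab.vectors`, since no embedding file is loaded in this example):

```python
import torch
import torch.nn as nn

# Stand-in for TEXT.vocab.vectors: a (vocab_size, emb_dim) tensor.
pretrained = torch.randn(1000, 100)

# Create the model's embedding layer with matching shape and copy
# the pretrained vectors into its weight matrix.
embedding = nn.Embedding(*pretrained.shape)
embedding.weight.data.copy_(pretrained)
```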