This repo is my implementation of the coding questions from Assignment 1 of the Stanford course:
- CS224n: Natural Language Processing with Deep Learning
In this assignment, the following are implemented using Python + NumPy:
- Tomas Mikolov's word2vec algorithm, in both skip-gram and CBOW versions, each with softmax loss and with negative sampling loss
- A sentiment analysis model built on the word embeddings obtained from word2vec.
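To illustrate the core of the negative sampling version, here is a minimal NumPy sketch of the loss and gradients for one (center, outside) pair. The function name, argument layout, and variable names are my own for illustration, not the assignment's actual API:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_loss_and_grads(center_vec, outside_idx, outside_vectors, neg_indices):
    """Negative sampling loss for one (center, outside) word pair.

    center_vec:      (d,)   embedding v_c of the center word
    outside_idx:     int    row of the true outside word in outside_vectors
    outside_vectors: (V, d) "outside" embedding matrix U
    neg_indices:     list   K sampled negative word indices
    """
    u_o = outside_vectors[outside_idx]        # (d,)
    u_k = outside_vectors[neg_indices]        # (K, d)

    pos_score = sigmoid(u_o @ center_vec)     # sigma(u_o . v_c)
    neg_score = sigmoid(-u_k @ center_vec)    # sigma(-u_k . v_c), shape (K,)

    # J = -log sigma(u_o . v_c) - sum_k log sigma(-u_k . v_c)
    loss = -np.log(pos_score) - np.sum(np.log(neg_score))

    # Gradient w.r.t. the center vector v_c
    grad_center = -(1 - pos_score) * u_o + (1 - neg_score) @ u_k

    # Gradients w.r.t. the outside vectors (accumulate over repeated negatives)
    grad_outside = np.zeros_like(outside_vectors)
    grad_outside[outside_idx] = -(1 - pos_score) * center_vec
    for coef, idx in zip(1 - neg_score, neg_indices):
        grad_outside[idx] += coef * center_vec

    return loss, grad_center, grad_outside
```

The gradients match the closed-form derivations asked for in the written section; they can be sanity-checked numerically with a finite-difference gradient check.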
Apart from the coding questions, the assignment also includes several mathematical questions, involving:
- Deriving the word-vector gradients for 4 different flavours of the word2vec model (skip-gram vs. CBOW; softmax loss vs. negative sampling loss)
- Deriving the computational complexity of each flavour.
- You can find my (handwritten, correct and more detailed) solutions here: CS224n_written_sections_solutions
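As a taste of those derivations, here is the skip-gram flavour with naive softmax loss, in the usual CS224n notation ($v_c$ the center word vector, $u_w$ the rows of the outside matrix $U$, $y$ and $\hat{y}$ the true and predicted word distributions); this is a sketch, see the handwritten solutions for the full working:

$$J(v_c, o, U) = -\log P(o \mid c) = -u_o^\top v_c + \log \sum_{w \in V} \exp\!\left(u_w^\top v_c\right)$$

$$\frac{\partial J}{\partial v_c} = U^\top\!\left(\hat{y} - y\right), \qquad \frac{\partial J}{\partial u_w} = \left(\hat{y}_w - y_w\right) v_c$$

The per-pair cost is dominated by the softmax sum over the full vocabulary, which is exactly what negative sampling avoids.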
- junior Teudjio Mbativou : https://www.linkedin.com/in/junior-teudjio-3a125b8a
- A big thank you to Stanford University for making this beautiful learning material openly available.
- A big thank you to Professors Richard Socher and Chris Manning for teaching the subject in such an intuitive and yet very deep and practical manner.
This project is licensed under the MIT License - see the LICENSE.md file for details