ICLR 2018 Quick-Thought vectors. Updated Jul 15, 2019 - Python
How to encode sentences in a high-dimensional vector space, a.k.a. sentence embedding.
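A minimal sketch of the idea, assuming the epfml/sent2vec Python wrapper is installed and a pretrained model file is available; the model path is a placeholder and the method names are assumptions, not code from this repository.

```python
# Sketch: embed sentences into a fixed-dimensional vector space with sent2vec.
# Assumes the epfml/sent2vec Python wrapper; "wiki_unigrams.bin" is a
# hypothetical path to a pretrained model, not a file shipped with this repo.
import sent2vec

model = sent2vec.Sent2vecModel()
model.load_model("wiki_unigrams.bin")

sentences = ["the cat sat on the mat", "a dog chased the ball"]
embeddings = model.embed_sentences(sentences)  # shape: (n_sentences, embedding_dim)
print(embeddings.shape)
```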
Finding look-alike sentences by leveraging the semantic similarities learned by transformer models during pre-training. Cosine similarity is used as an angular distance measure over sent2vec embeddings.
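An illustrative sketch of the retrieval step (not this repository's exact code): given a query embedding and a matrix of candidate sentence embeddings, rank candidates by cosine similarity with plain NumPy.

```python
# Toy sketch: rank candidate sentences by cosine similarity to a query vector.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def most_similar(query_vec: np.ndarray, candidate_vecs: np.ndarray, top_k: int = 3):
    """Return indices and scores of the top_k candidates closest to the query."""
    scores = np.array([cosine_similarity(query_vec, v) for v in candidate_vecs])
    order = np.argsort(-scores)[:top_k]
    return order, scores[order]


# Example with random vectors standing in for real sentence embeddings.
query = np.random.rand(100)
candidates = np.random.rand(50, 100)
print(most_similar(query, candidates))
```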
An approach to improving word sense induction (WSI) systems for web search result clustering, exploring the boundaries of vector space models for the WSI task. CHERTOY system. Chernenko, Tatjana and Toyota, Utaemon, Institute for Computational Linguistics, Heidelberg University, 2017/2018.
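A hedged sketch of the general vector-space idea behind WSI for search result clustering, not the CHERTOY system itself: embed result snippets, then group them with k-means so each cluster approximates one sense of the query word. The number of senses and the embedding dimensionality below are arbitrary placeholders.

```python
# Sketch: cluster embedded search result snippets into sense groups.
import numpy as np
from sklearn.cluster import KMeans


def cluster_snippets(snippet_vectors: np.ndarray, n_senses: int = 3) -> np.ndarray:
    """Assign each embedded snippet to one of n_senses clusters."""
    km = KMeans(n_clusters=n_senses, n_init=10, random_state=0)
    return km.fit_predict(snippet_vectors)


# Random vectors stand in for real snippet embeddings (e.g., from sent2vec).
labels = cluster_snippets(np.random.rand(20, 100), n_senses=3)
print(labels)
```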