Development of a production-ready vector quantization (VQ) layer in TensorFlow, based on the prototype developed in #25 and merged with #52 (+prototype 2).
Since the L1, L2, and inf norm orders seem to be problematic in high-dimensional space, it might be worth adding another norm order, namely `dotp`. It would
- normalize all input vectors to unit norm,
- initialize the embedding space with unit-norm vectors, and
- replace each input with the embedding space vector to which its dot product is greatest.
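The three steps above amount to a cosine-similarity nearest-neighbor lookup. A minimal NumPy sketch (the function name `dotp_quantize` is hypothetical, not part of the actual layer):

```python
import numpy as np

def dotp_quantize(x, emb):
    """Replace each input vector with the embedding vector of greatest
    dot product, after normalizing both sides to unit norm.

    x:   (batch, dim) input activations
    emb: (n_embeddings, dim) embedding space
    """
    x_n = x / np.linalg.norm(x, axis=1, keepdims=True)
    e_n = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sims = x_n @ e_n.T              # (batch, n_embeddings) cosine similarities
    idx = np.argmax(sims, axis=1)   # index of the closest embedding per input
    return e_n[idx], idx
```

Because both sides are unit-normalized, the greatest dot product is equivalent to the smallest cosine distance, which sidesteps the magnitude sensitivity of the L1/L2/inf norms.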
My guess is that the L1/L2/inf norms are simply not good distance measures in high-dimensional space. Maybe we could try applying PCA (or a similar dimensionality reduction) to both the activations and the embeddings and then compute the distance in the reduced space.
-> will be discussed in #56
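The PCA idea could be sketched roughly as follows; this is only an illustration under assumed shapes (the helper name `pca_l2_distance` and the joint fit over activations plus embeddings are my assumptions, not the agreed design):

```python
import numpy as np

def pca_l2_distance(acts, embs, n_components=2):
    """Hypothetical sketch: project activations and embeddings onto the
    top principal components of their union, then measure L2 there."""
    data = np.vstack([acts, embs])
    mean = data.mean(axis=0)
    # principal axes via SVD of the jointly centered data
    _, _, vt = np.linalg.svd(data - mean, full_matrices=False)
    comps = vt[:n_components]                 # (n_components, dim)
    a_p = (acts - mean) @ comps.T             # reduced activations
    e_p = (embs - mean) @ comps.T             # reduced embeddings
    # pairwise L2 distances in the reduced space, shape (len(acts), len(embs))
    return np.linalg.norm(a_p[:, None, :] - e_p[None, :, :], axis=-1)
```

Whether distances in the reduced space are actually better behaved would need to be validated empirically.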
Development branch: `vq-layer`
Documentation
Sub-tasks
- [x] … that were furthest away from embedding space vectors
- [x] `is_training` parameter: updates only during training (not needed anymore)
- [x] `scatter_update` call to `tf.GraphKeys.UPDATE_OPS`
- [x] `dotp` norm order (VQ-Layer Cosine Distance #63)