Description
At present, the Self-Attention mechanism uses a simple cosine-similarity/dot-product scoring algorithm with a selectivity threshold. This approach has proven very inaccurate, and its selections cannot be reverse-searched: incorrect selections corrupt the integrity of the database.
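For reference, here is a minimal sketch of the kind of threshold-based similarity selection described above. It is illustrative only; the function names and the 0.5 threshold are assumptions, not MAGIST's actual code.

import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def select_tokens(query_vec, token_vecs, threshold=0.5):
    """Keep only tokens whose similarity to the query exceeds the threshold.

    A hard cutoff like this is lossy: borderline-relevant tokens are
    silently dropped, and the selection cannot be inverted to recover
    the original context (the reverse-search problem noted above).
    """
    return [i for i, v in enumerate(token_vecs)
            if cosine_similarity(query_vec, v) > threshold]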
To Reproduce
To reproduce the behavior:
from MAGIST.NLP.SelfAttention import TextPreprocessing

t = TextPreprocessing("config.json")
out = t("Hello, my name is John. I am a dummy script.")  # invokes __call__
for i in out:
    print(i)
Expected behavior
The attention mechanism should make accurate, relevant token selections that remain reverse-searchable and preserve the integrity of the database.
Additional context
This was expected, since the algorithm is very primitive. A better positional embedding, or an end-to-end LSTM-Dense neural network, would likely improve its performance.
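For illustration, a minimal sketch of the suggested LSTM-Dense direction: a small network that learns a per-token relevance score end-to-end instead of applying a fixed cosine cutoff. All layer sizes, hyperparameters, and names here are assumptions, not existing MAGIST code.

import tensorflow as tf

# Assumed hyperparameters for the sketch.
vocab_size, embed_dim = 10_000, 64

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.LSTM(64, return_sequences=True),  # contextualize each token
    tf.keras.layers.Dense(1, activation="sigmoid"),   # per-token relevance score
])
model.compile(optimizer="adam", loss="binary_crossentropy")

Because the relevance score is learned rather than thresholded on raw similarity, selections could be tuned against labeled data, which should address the accuracy problem described above.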