Hi, thank you for this awesome toolkit! What confuses me is that the RobustLog model doesn't seem to include an attention layer, which is mentioned in the paper. Would you mind explaining the reason? I'm an ML/DL newbie. Thanks in advance!
In my experiments (with both LogAnomaly and RobustLog), the attention mechanism (which seems popular in NLP papers) had little effect on the results.
You can try it; if it works well, a pull request is welcome!
I will update the code with an attention layer later :)
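For reference, here is a minimal sketch of what such an attention layer over Bi-LSTM outputs could look like in PyTorch. This is not the toolkit's actual code, and the class and argument names (`AttentionBiLSTM`, `input_dim`, `hidden_dim`) are purely illustrative; it just shows the common pattern of scoring each time step and taking a weighted sum instead of using only the last hidden state:

```python
# Illustrative sketch only: a Bi-LSTM classifier with a simple
# learned attention over time steps, in the spirit of the attention
# described in the RobustLog paper. Not the toolkit's real API.
import torch
import torch.nn as nn


class AttentionBiLSTM(nn.Module):
    def __init__(self, input_dim, hidden_dim, num_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(input_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Scores each time step's 2*hidden_dim output with a scalar.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, input_dim), e.g. semantic vectors of log events
        outputs, _ = self.lstm(x)                 # (batch, seq_len, 2*hidden)
        scores = self.attn(outputs)               # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)    # attention over time steps
        context = (weights * outputs).sum(dim=1)  # (batch, 2*hidden)
        return self.fc(context)


# Quick smoke test with random data:
model = AttentionBiLSTM(input_dim=300, hidden_dim=128)
logits = model(torch.randn(4, 50, 300))  # -> (4, 2)
```

Swapping `context` for the last hidden state is a one-line change, which makes it easy to A/B test whether attention actually helps on your dataset.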