
Problems About RobustLog #4

Closed
rhanqtl opened this issue Apr 23, 2020 · 2 comments

Comments


rhanqtl commented Apr 23, 2020

Hi, thank you for this awesome toolkit!

What confuses me is that the RobustLog model doesn't seem to include the attention layer mentioned in the paper. Would you mind explaining the reason? I'm an ML/DL newbie. Thanks in advance!

Owner

d0ng1ee commented Apr 23, 2020

In my experiments (with both loganomaly and robustlog), the attention mechanism (which seems popular in NLP papers) had little effect on the results.
You can try it; if it works, a pull request is welcome!
I will update the code with an attention layer later :)
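
For anyone who wants to try this, here is a minimal sketch of attention pooling over BiLSTM outputs in PyTorch. This is not logdeep's actual robustlog code: the class name, layer sizes, and the assumption of semantic-vector inputs are illustrative only and would need to be aligned with the real model.

```python
import torch
import torch.nn as nn

class AttnBiLSTM(nn.Module):
    """BiLSTM encoder with simple additive attention pooling (illustrative sketch)."""

    def __init__(self, input_size=300, hidden_size=128, num_layers=2, num_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers,
                            batch_first=True, bidirectional=True)
        # One attention score per time step, computed from the BiLSTM outputs.
        self.attn = nn.Linear(2 * hidden_size, 1)
        self.fc = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, input_size) window of log-event vectors
        out, _ = self.lstm(x)                          # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(out), dim=1) # normalize over time steps
        context = (weights * out).sum(dim=1)           # weighted sum -> (batch, 2*hidden)
        return self.fc(context)                        # (batch, num_classes)

# Quick shape check with dummy data:
model = AttnBiLSTM()
logits = model(torch.randn(4, 50, 300))  # 4 windows, 50 events each
print(logits.shape)                      # torch.Size([4, 2])
```

The only change from plain last-hidden-state or mean pooling is that the weighted sum lets the classifier focus on the most informative events in the window, which is how attention is typically motivated in the RobustLog paper's setting.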

Author

rhanqtl commented Apr 24, 2020

Thank you for your reply!

rhanqtl closed this as completed Apr 24, 2020