arunbaruah/Anomaly_Detection_Transformer

Anomaly Detection from Logs using Transformer

A PyTorch implementation of anomaly detection from OS logs using a Transformer model. The implementation is divided into two stages: log parsing and anomaly detection.

Architecture

Dataset

Currently, the dataset in use is extracted from a private Elasticsearch repository. Public log datasets can be downloaded from loghub.

Parser

The parser is based on the Spell parser and is used to obtain structured representations of log entries. The base implementation can be found in logparser.
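The core idea of Spell is to match each incoming log line against known templates via the longest common subsequence (LCS), replacing the positions that vary with wildcards. A minimal sketch of that idea (a simplified, hypothetical `SpellParser` with a greedy merge, not the logparser code used by this repo) could look like:

```python
from typing import List

def lcs(a: List[str], b: List[str]) -> List[str]:
    """Longest common subsequence of two token lists (dynamic programming)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = dp[i][j] + 1 if a[i] == b[j] else max(dp[i][j + 1], dp[i + 1][j])
    # Backtrack to recover the subsequence itself.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

class SpellParser:
    """Keeps a list of templates; assigns each log line the ID of its best match."""
    def __init__(self, tau: float = 0.5):
        self.tau = tau  # a match requires |LCS| > tau * |entry tokens|
        self.templates: List[List[str]] = []

    def parse(self, line: str) -> int:
        tokens = line.split()
        for idx, tpl in enumerate(self.templates):
            common = lcs(tokens, [t for t in tpl if t != "<*>"])
            if len(common) > self.tau * len(tokens):
                # Merge: positions not in the LCS become wildcards.
                it = iter(common)
                nxt = next(it, None)
                new_tpl = []
                for tok in tokens:
                    if tok == nxt:
                        new_tpl.append(tok)
                        nxt = next(it, None)
                    else:
                        new_tpl.append("<*>")
                self.templates[idx] = new_tpl
                return idx
        self.templates.append(tokens)  # no match: new template
        return len(self.templates) - 1
```

For example, `"Connection from 10.0.0.1 closed"` and `"Connection from 10.0.0.2 closed"` would map to the same template ID, with the IP position replaced by `<*>`.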

Anomaly Detection

Anomaly detection is based on unsupervised learning. Transformer-based models have proven effective at language generation; just as they generate the next word or character, this model learns to generate the next log entry.
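This setup can be sketched as follows (a hypothetical `LogKeyTransformer`, not the repo's actual code): parsed log entries are mapped to template IDs ("log keys"), and a causally masked Transformer encoder is trained to predict the next key at each position.

```python
import torch
import torch.nn as nn

class LogKeyTransformer(nn.Module):
    """Predicts the next log key (template ID) at each position of a sequence."""
    def __init__(self, num_keys: int, d_model: int = 64, nhead: int = 4,
                 num_layers: int = 2, max_len: int = 512):
        super().__init__()
        self.embed = nn.Embedding(num_keys, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))  # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, num_keys)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len) of log-key IDs
        seq_len = x.size(1)
        h = self.embed(x) + self.pos[:, :seq_len]
        # Causal mask: each position attends only to earlier log entries.
        mask = torch.triu(torch.full((seq_len, seq_len), float("-inf")), diagonal=1)
        h = self.encoder(h, mask=mask)
        return self.head(h)  # (batch, seq_len, num_keys) next-key logits
```

Training would use cross-entropy between the logits at position t and the actual key at position t+1, exactly as in language modeling.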

Flow

The model is trained to learn the normal behavior of log sequences, i.e. to generate the next normal log entry. A log entry is flagged as an anomaly if it is not among the model's top predicted candidates for its position.
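The detection rule reduces to a top-k check (a hypothetical helper; k is a tunable parameter, not a value specified by the repo):

```python
import torch

def is_anomaly(logits: torch.Tensor, actual_key: int, k: int = 5) -> bool:
    """Flag the entry if its actual key is not among the top-k predicted keys.

    logits: (num_keys,) next-key scores the model produced for this position.
    """
    topk = torch.topk(logits, k).indices
    return actual_key not in topk.tolist()
```

A smaller k makes detection stricter (more anomalies flagged, more false positives); a larger k is more permissive.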

Testing

The base Transformer implementation can be found at harvardnlp.

References

Min Du, Feifei Li. Spell: Online Streaming Parsing of Large Unstructured System Logs. IEEE Transactions on Knowledge and Data Engineering (TKDE), 2018.

Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin. Attention Is All You Need. In Advances in Neural Information Processing Systems, 2017.
