XENIA

This repository contains the code referred to in the work:

An XAI-based adversarial training approach for cyber-threat detection [XENIA]

Malik AL-Essa, Giuseppina Andresini, Annalisa Appice, Donato Malerba

Cite this paper

M. AL-Essa, G. Andresini, A. Appice, D. Malerba, An XAI-based adversarial training approach for cyber-threat detection, in: 2022 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech), 2022, pp. 1–8.

Code Requirements

Description of this repository

Two datasets are used in this work: CICIDS17 and CIC-MalDroid20. A MinMax scaler is used to normalize both datasets. The datasets and the models used in this work can be downloaded through:
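As a reference for the normalization step, here is a minimal sketch using scikit-learn's MinMaxScaler; X_train and X_test are hypothetical feature matrices, not names from this repository, and the scaler is fitted on the training split only:

    from sklearn.preprocessing import MinMaxScaler

    scaler = MinMaxScaler()                  # scales each feature to [0, 1]
    X_train = scaler.fit_transform(X_train)  # fit on the training split only
    X_test = scaler.transform(X_test)        # reuse the training statistics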

How to use

The implementations of all the experiments performed in this work are included in this repository.

  • main.py : to run XENIA (see the usage example below)
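Assuming Python and the project's dependencies are installed, the script is launched from the repository root:

    python main.py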

Replicate the Experiments

To replicate the experiments of this work, the models and datasets saved in Datasets and Models can be used. The global variables are stored in Conf.conf :

  • TRAIN_BASELINE = 0   # 1 to train the baseline with hyperopt
  • CREATE_ADVERSARIAL_SET = 0   # 1 to create the adversarial samples
  • Attack_Type = 1   # 1 for FGSM (see the sketch after this list)
  • train_Attack = 1   # 0 not to train, 1 to train a model using adversarial training
  • local_shap_values = 1   # 1 to compute local SHAP values, 0 to load the saved values (see the sketch after this list)
  • Config_model = 6   # 2 for the baseline (V1), 6 for the T_A model (V2)
  • Fine_Tuning = 0   # 1 to fine-tune V2 (the adversarial training model); the model is fine-tuned twice, using XAI and T+A
  • Fine_Tuning_baseline = 0   # 1 to fine-tune the baseline (V1) model; the model is fine-tuned twice, using XAI and T
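For Attack_Type = 1, the sketch below illustrates FGSM adversarial sample generation with TensorFlow/Keras. This is a minimal reference, not the repository's exact implementation; model, x, y, and epsilon are placeholder names:

    import tensorflow as tf

    def fgsm_samples(model, x, y, epsilon=0.1):
        # FGSM perturbs each input along the sign of the loss gradient
        x = tf.convert_to_tensor(x, dtype=tf.float32)
        with tf.GradientTape() as tape:
            tape.watch(x)
            loss = tf.keras.losses.sparse_categorical_crossentropy(y, model(x))
        grad = tape.gradient(loss, x)
        x_adv = x + epsilon * tf.sign(grad)
        # keep perturbed samples inside the MinMax-scaled [0, 1] range
        return tf.clip_by_value(x_adv, 0.0, 1.0)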
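For local_shap_values = 1, here is a minimal sketch of computing local SHAP values with the shap library, assuming a trained Keras classifier model and hypothetical arrays x_background (a sample of the training set) and x_test:

    import shap

    # DeepExplainer estimates per-feature SHAP values for a deep model
    # from a background sample of the training data
    explainer = shap.DeepExplainer(model, x_background)
    shap_values = explainer.shap_values(x_test)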
