S-EvoPruneDeepTL

This is the official repository of EvoPruneDeepTL: An Evolutionary Pruning Model for Transfer Learning based Deep Neural Networks.

Code

The implementation of EvoPruneDeepTL is divided into the following folders:

  • EvoDeepTLPruning FC1 FC2: contains the Python files for the one-layer approaches.
  • EvoDeepTLPruning Both: contains the Python files for the both-layers approach.
  • CNN pruning methods: contains the implementation of the CNN pruning methods compared in the paper.
  • configs: contains the configuration files for each dataset analyzed in the paper.
  • convergence images: contains the convergence plots for some of the datasets.

Execution

To execute the code presented above, the only requirements are:

Python >= 3.6, Keras >= 2.2.4

Then, given one of the previous folders and a dataset, the command is the following:

python3 main.py configs/configDataset[dataset].csv configGA[Consecutive].csv numberExecution

where:

  • dataset names the dataset to analyze.
  • the GA configuration is either configGA.csv for the one-layer approach or configGAConsecutive.csv for the both-layers approach.
  • numberExecution refers to the index of the execution being carried out.
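For example, assuming a dataset whose configuration file is configs/configDatasetExample.csv (an illustrative name, not an actual file in this repository), the first run of the one-layer approach would be launched as:

python3 main.py configs/configDatasetExample.csv configGA.csv 1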

Datasets

The datasets used in this paper can be downloaded from:

Results

EvoPruneDeepTL optimizes sparse layers using a genetic algorithm, producing a network scheme such as the one shown below.

(figure: scheme of a network pruned by EvoPruneDeepTL)
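To illustrate the idea, a pruned fully-connected layer can be emulated by element-wise multiplying the layer inputs with a binary mask decoded from a GA chromosome. The sketch below is a minimal example of that mechanism written against tf.keras for self-containedness; the class and variable names are ours and it is not the exact implementation used in this repository.

import numpy as np
import tensorflow as tf
from tensorflow import keras

class MaskedDense(keras.layers.Layer):
    """Dense layer whose input connections can be switched off by a binary mask.

    Minimal sketch: each gene of the GA chromosome enables (1) or disables (0)
    one input neuron of the layer, which is one possible pruning encoding.
    """

    def __init__(self, units, mask, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        # mask has shape (input_dim,): 1 keeps the neuron, 0 prunes it
        self.mask = tf.constant(mask, dtype=tf.float32)

    def build(self, input_shape):
        self.w = self.add_weight(shape=(input_shape[-1], self.units),
                                 initializer="glorot_uniform", trainable=True)
        self.b = self.add_weight(shape=(self.units,),
                                 initializer="zeros", trainable=True)

    def call(self, inputs):
        # Zero out pruned input neurons before the matrix product
        return tf.nn.relu(tf.matmul(inputs * self.mask, self.w) + self.b)

# Example: a random chromosome prunes roughly half of 512 extracted features
chromosome = np.random.randint(0, 2, size=512)
layer = MaskedDense(units=128, mask=chromosome)
print(layer(tf.random.normal((4, 512))).shape)  # (4, 128)

In the evolutionary loop, each chromosome would be decoded into such a mask, the resulting network trained and evaluated, and the obtained accuracy used as the fitness of that individual.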

The following table shows the average results of EvoPruneDeepTL compared with the CNN pruning methods.

(table: average results of EvoPruneDeepTL vs. CNN pruning methods)

We also show the results of our feature selection mechanism against the CNN pruning methods.

We have also compared different feature extractors to select the one that best fits our data:

(table: comparison of feature extractors)
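For reference, a pretrained network can be turned into a fixed feature extractor in Keras as sketched below; the choice of ResNet50 and the image size are only an example under our own assumptions, not necessarily the extractor selected in the paper.

import numpy as np
from tensorflow.keras.applications import ResNet50
from tensorflow.keras.applications.resnet50 import preprocess_input

# Frozen convolutional base used as a feature extractor (transfer learning)
extractor = ResNet50(weights="imagenet", include_top=False, pooling="avg")
extractor.trainable = False

# images: batch of RGB images resized to 224x224, values in [0, 255]
images = np.random.randint(0, 256, size=(8, 224, 224, 3)).astype("float32")
features = extractor.predict(preprocess_input(images))  # shape (8, 2048)

The extracted features feed the fully-connected layers that EvoPruneDeepTL then prunes or uses for feature selection.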

Relevant classes and robustness

We have carried out an analysis of the ability of EvoPruneDeepTL to adapt to relevant classes and of its robustness.

Relevant classes

The first experiment removes one class from each dataset to check the importance of that class in the data.

(table: results when one class is removed from each dataset)

The second experiment adds classes to the dataset one at a time until it is fully complete.

(table: results as classes are added incrementally)

Robustness

We have used a novel metric called CKA (Centered Kernel Alignment) to check the robustness of the obtained pruned neural networks. This comparison has been carried out against the closest network (selected using the Hamming distance) and against a fully-connected network. The results show the robustness of EvoPruneDeepTL: the second column compares similar networks, while the fourth column shows the results for more dissimilar networks:

(table: CKA robustness results)
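A minimal NumPy sketch of linear CKA between the activations of two networks on the same inputs is given below; it follows the standard definition of the metric and is not the exact script used to produce the table above.

import numpy as np

def linear_cka(x, y):
    """Linear Centered Kernel Alignment between two activation matrices.

    x: (n_examples, n_features_1), y: (n_examples, n_features_2),
    both computed on the same inputs. Returns a value in [0, 1];
    values close to 1 mean the two representations are highly similar.
    """
    x = x - x.mean(axis=0, keepdims=True)   # centre each feature
    y = y - y.mean(axis=0, keepdims=True)
    cross = np.linalg.norm(y.T @ x, "fro") ** 2
    norm_x = np.linalg.norm(x.T @ x, "fro")
    norm_y = np.linalg.norm(y.T @ y, "fro")
    return cross / (norm_x * norm_y)

# Example: activations of a pruned network vs. a fully-connected one
pruned_acts = np.random.randn(200, 128)
dense_acts = np.random.randn(200, 512)
print(linear_cka(pruned_acts, dense_acts))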
