This library uses genetic algorithms for automated hyper-parameter optimization of machine-learning models. It was developed as my ICS674 evolutionary computation project.

geneticml

About

This library uses genetic algorithms for automated hyper-parameter optimization of machine-learning models.

Supported Models

Currently supported:

  • RandomForestClassifier, GradientBoostingClassifier, LogisticRegression, MLPClassifier.
  • DecisionTreeRegressor, RandomForestRegressor, GradientBoostingRegressor

__Coming soon:__ xgboost, lightgbm, catboost.
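For orientation, the kinds of hyper-parameters an EA searches over for these models can be written down as per-model search spaces. The bounds below are illustrative assumptions for two of the supported models, not the library's built-in ranges:

```python
# Illustrative hyper-parameter search spaces for two of the supported models.
# These bounds are example assumptions, not geneticml's internal defaults.
search_spaces = {
    "RandomForestClassifier": {
        "n_estimators": (10, 500),               # integer range
        "max_depth": (2, 30),                    # integer range
        "min_samples_split": (2, 20),            # integer range
        "max_features": ["sqrt", "log2", None],  # categorical choice
    },
    "GradientBoostingClassifier": {
        "n_estimators": (50, 500),
        "learning_rate": (0.01, 0.3),            # continuous range
        "max_depth": (2, 8),
    },
}

for model_name, space in search_spaces.items():
    print(model_name, "->", len(space), "tunable hyper-parameters")
```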

Usage

pip install -i https://test.pypi.org/simple/ geneticml

In your Python script or notebook:

from geneticml.variation_operators import differential_evolution
from sklearn.ensemble import RandomForestClassifier

#sample data
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

Load your dataset in X, y form. You can split off your own train and test sets; the code will internally create a validation set to evaluate each evolution candidate's fitness.

data =  load_breast_cancer()
X = data.data
y = data.target

x_train,x_test,y_train,y_test = train_test_split(X,y,test_size=0.25,random_state=45)
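The internal validation split mentioned above works to the same effect as the sketch below. The 20% split fraction is an illustrative assumption, not the library's actual setting:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
x_train, x_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.25, random_state=45)

# Hold out part of the training data to score candidate hyper-parameter
# settings; the 20% fraction here is an assumption for illustration.
x_fit, x_val, y_fit, y_val = train_test_split(
    x_train, y_train, test_size=0.2, random_state=0)

print(x_fit.shape, x_val.shape)
```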

Create the EA object:

test = differential_evolution(x_train,
	y_train,
	RandomForestClassifier,
	improvement=0.1,
	population_size=10,
	mutation_prob=0.13,
	elitism=0.15,
	crossover_prob=0.70,
	max_gen=20)

Run the EA search (this may take a while depending on dataset size):

test.Main()

#best model
test.best
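To see what such a search does under the hood, here is a minimal, self-contained genetic search over RandomForest hyper-parameters: score a population on a validation split, keep the elite, and fill the next generation with crossover plus mutation. This is a sketch of the general technique, not geneticml's implementation; the parameter bounds, population size, and operator rates are all assumptions:

```python
import random
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

random.seed(0)

data = load_breast_cancer()
x_fit, x_val, y_fit, y_val = train_test_split(
    data.data, data.target, test_size=0.3, random_state=0)

# Hyper-parameter bounds (illustrative assumptions).
BOUNDS = {"n_estimators": (10, 100), "max_depth": (2, 15),
          "min_samples_split": (2, 10)}

def random_candidate():
    return {k: random.randint(lo, hi) for k, (lo, hi) in BOUNDS.items()}

def fitness(params):
    # Score a candidate by its validation-set AUC.
    model = RandomForestClassifier(random_state=0, **params).fit(x_fit, y_fit)
    return roc_auc_score(y_val, model.predict_proba(x_val)[:, 1])

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent.
    return {k: random.choice((a[k], b[k])) for k in BOUNDS}

def mutate(params, prob=0.13):
    # Resample each gene with probability `prob`.
    return {k: (random.randint(*BOUNDS[k]) if random.random() < prob else v)
            for k, v in params.items()}

pop = [random_candidate() for _ in range(6)]
for gen in range(3):
    ranked = sorted(pop, key=fitness, reverse=True)
    elite = ranked[:2]                      # elitism: carry over the best two
    children = [mutate(crossover(*random.sample(elite, 2)))
                for _ in range(len(pop) - len(elite))]
    pop = elite + children

best = max(pop, key=fitness)
print(best, round(fitness(best), 3))
```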

To use the best fitted model on your test set and measure the AUC score:

from sklearn.metrics import roc_auc_score

test_pred = test.best['best_fitted_model'].predict_proba(x_test)[:,1]
roc_auc_score(y_test, test_pred)
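`roc_auc_score` lives in `sklearn.metrics` and expects the positive-class probabilities, which is why the `[:, 1]` column of `predict_proba` is passed. A quick sanity check of the metric itself:

```python
from sklearn.metrics import roc_auc_score

# Perfectly ranked scores give an AUC of 1.0.
y_true = [0, 0, 1, 1]
y_score = [0.1, 0.2, 0.8, 0.9]
print(roc_auc_score(y_true, y_score))  # → 1.0
```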

One-plus-one EA with gradient boosting

from geneticml.variation_operators import one_plus_one
from sklearn.ensemble import GradientBoostingClassifier

test2 = one_plus_one(x_train,
	y_train,
	GradientBoostingClassifier,
	improvement=0.1,
	mutation_prob=0.9,
	max_gen=20,
	email=False)

test2.Main()

#best model
test2.best
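For intuition, a (1+1) EA keeps a single parent solution and replaces it only when a mutated child scores at least as well. A minimal sketch on a toy objective (not the library's code; the objective, step size, and iteration count are assumptions):

```python
import random

random.seed(1)

def objective(x):
    # Toy fitness function: maximized at x = 3.
    return -(x - 3.0) ** 2

parent = 0.0
for _ in range(200):
    child = parent + random.gauss(0, 0.5)    # mutate the single parent
    if objective(child) >= objective(parent):
        parent = child                       # (1+1) selection: keep the better

print(round(parent, 2))
```

Because the parent is only ever replaced by an equal-or-better child, fitness never decreases, so the search climbs steadily toward the optimum.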
