Tuning XGBoost hyper-parameters with Simulated Annealing
-
Updated
Apr 26, 2017 - Jupyter Notebook
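The hyper-parameter search named in the title can be sketched as a simulated-annealing loop over parameter configurations. This is a minimal illustration, not the repository's actual code: `toy_score` and `toy_neighbors` are hypothetical stand-ins, and in practice the score function would train an XGBoost model with the candidate parameters and return a cross-validated metric.

```python
import math
import random

def simulated_annealing(score, neighbors, start, t0=1.0, cooling=0.9,
                        steps=50, seed=0):
    """Maximize `score` over hyper-parameter dicts via simulated annealing."""
    rng = random.Random(seed)
    current = best = start
    cur_s = best_s = score(start)
    t = t0
    for _ in range(steps):
        cand = neighbors(current, rng)      # propose a neighboring configuration
        cand_s = score(cand)
        # Always accept improvements; accept worse candidates with
        # Boltzmann probability exp((cand_s - cur_s) / t).
        if cand_s >= cur_s or rng.random() < math.exp((cand_s - cur_s) / t):
            current, cur_s = cand, cand_s
            if cur_s > best_s:
                best, best_s = current, cur_s
        t *= cooling                        # geometric cooling schedule
    return best, best_s

# Hypothetical stand-in for a cross-validated XGBoost score: peaks at
# max_depth = 6, eta = 0.1.
def toy_score(params):
    return -(params["max_depth"] - 6) ** 2 - (params["eta"] - 0.1) ** 2 * 100

def toy_neighbors(params, rng):
    p = dict(params)
    if rng.random() < 0.5:
        p["max_depth"] = max(1, p["max_depth"] + rng.choice([-1, 1]))
    else:
        p["eta"] = min(1.0, max(0.01, p["eta"] + rng.choice([-0.05, 0.05])))
    return p

best, best_s = simulated_annealing(toy_score, toy_neighbors,
                                   {"max_depth": 3, "eta": 0.3})
```

The cooling schedule makes the search accept almost any move early on (broad exploration) and only improvements near the end (local refinement), which is what distinguishes simulated annealing from plain random search or hill climbing.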
R code for common Machine Learning Algorithms
DSG17 | International Machine Learning Competition from Deezer | The goal of this challenge was to predict whether users in the test dataset listened to the first track that Deezer's own music recommendation algorithm proposed to them.
Task provided
All code, both as written and as optimized for best results, from the SuperDataScience Course
This repository is for Fake News Detection using Deep Learning models
Machine Learning Project on Imbalanced Data in R
The Python notebook runs on Google's new Colaboratory tool. It's a churn model run on 3 different algorithms for comparison.
Extreme Gradient Boosting (XGBoost) with R and H2o for Stroke Prediction
Instacart dataset for analysis of various products grouped by department and customer purchase likelihood. Prediction of products likely to be bought together in a customer's basket on the next purchase.
Ariba Code-A-Thon 2018
Predicting Behaviour of Bug Finding Tools KLEE and AFL
A Python script that predicts a person's income based on other criteria
Machine learning tutorial with examples
Notebooks from my blog, meterdatascience.weebly.com