Kaggle Otto Group Product Classification Challenge

My solution scored 0.42232 and finished the Otto competition in 218th place out of 3514 teams.
It is a 1:1 blend of an XGBoost model and the average of 20 neural nets. Model hyperparameters, the NN architecture, and the blend weights were all chosen manually.
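
The blend itself is simple arithmetic on class-probability matrices. Below is a minimal sketch of that step, assuming stand-in prediction matrices; in the real pipeline they would come from the trained XGBoost model and the 20 neural-net runs, and this is not the repo's actual code.

```python
# Sketch of the 1:1 blend; prediction matrices here are random stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_classes = 1000, 9            # Otto has 9 product classes

def fake_probs():
    """Stand-in class-probability matrix with rows summing to 1."""
    p = rng.random((n_samples, n_classes))
    return p / p.sum(axis=1, keepdims=True)

xgb_probs = fake_probs()                                        # XGBoost predictions
nn_probs = np.mean([fake_probs() for _ in range(20)], axis=0)   # average of 20 NN runs

blend = 0.5 * xgb_probs + 0.5 * nn_probs  # 1:1 blend, weights chosen manually
```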

Requires:

Other work

Other Kagglers' insights that I found particularly interesting. For the most part they relate to blending. I list them here for further study:

  1. Triskelion. Competition 62nd. Blending
    forum link 1
    forum link 2
    Ensemble Selection from Libraries of Models
    In turn, he refers to another Kaggler, Emanuele Olivetti (forked code). A sketch of the greedy selection procedure appears after this list.

  2. Hoang Duong. Competition 6th. Blending
    forum link
    documentation

  3. Adam Harasimowicz. Competition 66th. Blending, Hyperopt
    forked code
    blog post

  4. Mike Kim. Competition 8th. t-SNE features and meta-bagging (a sketch of the t-SNE feature idea also follows this list)
    forum link
    code
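
For item 1, here is a sketch of greedy ensemble selection in the spirit of Caruana et al., "Ensemble Selection from Libraries of Models" (ICML 2004). This is my reading of the technique, not code from any of the linked repositories; the function name and parameters are illustrative.

```python
# Greedy forward selection with replacement: repeatedly add the library model
# that most lowers validation log loss of the running average.
import numpy as np
from sklearn.metrics import log_loss

def ensemble_selection(library, y_val, n_rounds=25):
    """library: list of class-probability matrices; y_val: validation labels."""
    blend = np.zeros_like(library[0])
    picks = []
    for _ in range(n_rounds):
        k = len(picks)
        # Score the blend that would result from adding each candidate once.
        scores = [log_loss(y_val, (blend * k + p) / (k + 1)) for p in library]
        best = int(np.argmin(scores))
        picks.append(best)
        blend = (blend * k + library[best]) / (k + 1)
    return blend, picks
```

Because models are picked with replacement, strong models effectively receive larger blend weights, which is what makes this a simple but robust alternative to hand-tuned weights.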
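
For item 4, a sketch of the t-SNE feature idea: embed the raw features into 2-D with t-SNE and append the coordinates as extra columns for downstream models. Array shapes and t-SNE parameters below are illustrative assumptions, not taken from Mike Kim's code.

```python
# Append 2-D t-SNE coordinates to the original feature matrix.
import numpy as np
from sklearn.manifold import TSNE

X = np.random.rand(500, 93)               # stand-in for the 93 Otto features
tsne_xy = TSNE(n_components=2, perplexity=30.0,
               random_state=0).fit_transform(X)
X_aug = np.hstack([X, tsne_xy])           # original features + 2 t-SNE coords
```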