My solution scored 0.42232 and finished the Otto competition in 218th place out of 3514 teams.
It is a 1:1 blend of an XGBoost model and the average of 20 neural nets. The models' hyperparameters, the NN architecture, and the blend weights were chosen manually.
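The blend itself is just a weighted average of predicted class probabilities. Below is a minimal sketch, assuming each model outputs an (n_samples, 9) probability matrix for Otto's nine classes; the function and argument names are illustrative, not taken from the solution code.

```python
# Minimal sketch of the 1:1 blend described above. All names here are
# illustrative; the actual weights were chosen manually.
import numpy as np

def blend_predictions(xgb_probs, nn_probs_list, w_xgb=0.5, w_nn=0.5):
    """Average the neural-net predictions, then mix 1:1 with XGBoost."""
    nn_avg = np.mean(nn_probs_list, axis=0)      # average the 20 NN runs
    blended = w_xgb * xgb_probs + w_nn * nn_avg  # manually chosen weights
    # Renormalize so each row of class probabilities sums to 1.
    return blended / blended.sum(axis=1, keepdims=True)
```

Averaging the neural nets before blending smooths out the run-to-run variance of the individual nets, so the XGBoost model is mixed against one stable prediction rather than 20 noisy ones.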
Requires:
Other Kagglers' insights that I found particularly interesting. For the most part they relate to blending. I list them here for further study:
- Triskelion. 62nd in the competition. Blending: forum link 1, forum link 2, "Ensemble Selection from Libraries of Models". He in turn refers to another Kaggler, Emanuele Olivetti (forked code).
- Hoang Duong. 6th in the competition. Blending: forum link, documentation.
- Adam Harasimowicz. 66th in the competition. Blending and Hyperopt: forked code, blog post.
- Mike Kim. 8th in the competition. t-SNE features and meta-bagging: forum link, code (a t-SNE sketch follows this list).
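As an illustration of the t-SNE-features idea from the last item, here is a minimal sketch (my own, not Mike Kim's code), assuming scikit-learn's TSNE: it embeds the raw feature matrix into 2-D and appends the embedding as extra columns.

```python
# Minimal sketch of t-SNE features (illustrative, not Mike Kim's code):
# embed the raw feature matrix into 2-D and append the embedding as
# extra columns for downstream models to train on.
import numpy as np
from sklearn.manifold import TSNE

def add_tsne_features(X, n_components=2, random_state=0):
    """Return X augmented with a t-SNE embedding of itself."""
    embedding = TSNE(n_components=n_components,
                     random_state=random_state).fit_transform(X)
    return np.hstack([X, embedding])
```

Note that t-SNE has no out-of-sample transform, so in practice the embedding is computed on the train and test rows stacked together and then split back apart.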