Notebooks for the Kaggle House Prices competition (https://www.kaggle.com/c/house-prices-advanced-regression-techniques).
Trying the fastai TabularLearner with some manual preprocessing yields an RMSLE of 0.1326. XGBoost performs considerably better on the same (preprocessed) data, with an RMSLE of 0.11775.
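For reference, the competition metric (RMSLE) can be sketched as below; the function name and use of NumPy are my own choices, not taken from the notebooks:

```python
import numpy as np

def rmsle(y_true, y_pred):
    # Root Mean Squared Log Error: RMSE computed on log(1 + y),
    # which penalizes relative rather than absolute errors
    return np.sqrt(np.mean((np.log1p(y_pred) - np.log1p(y_true)) ** 2))
```

Kaggle computes this on the sale price, so identical predictions score exactly 0.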
When ensembled (80/20, xgboost/nn), the score improves to 0.11675, reaching rank 486 (top 10%).
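The 80/20 blend is a plain weighted average of the two models' predictions; a minimal sketch (the prediction arrays here are illustrative placeholders, not values from the notebooks):

```python
import numpy as np

# Hypothetical sale-price predictions from the two models
xgb_preds = np.array([120000.0, 250000.0, 98000.0])
nn_preds = np.array([118000.0, 260000.0, 101000.0])

# 80/20 weighted blend, matching the ensemble weights above
blend = 0.8 * xgb_preds + 0.2 * nn_preds
```

Blending in log space (averaging `np.log1p` of the predictions) is a common alternative when the metric is RMSLE, and may be worth trying as well.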
To improve: fine-tune the NN to reach performance similar to the gradient-boosting model.